# lambda-to-gcf-migrator

An AI-powered tool that automatically converts AWS Lambda Java functions to Google Cloud Functions with intelligent code transformation.

## Features
- Automatic Handler Detection: Detects and maps AWS Lambda handler signatures to GCP Cloud Functions entry points
- AI-Powered SDK Analysis: Uses OpenAI to analyze AWS SDK calls and suggest GCP equivalent services (see the sketch after this list)
- Build Configuration Migration: Converts Maven `pom.xml` to Gradle `build.gradle` with proper GCP dependencies
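As an illustration of the SDK-analysis feature, here is a minimal sketch of how its suggestions might be inspected. The `sdk_suggestions` attribute and its `aws_call` / `gcp_equivalent` / `notes` fields are hypothetical names introduced only for illustration; the documented entry point is `LambdaToGCFMigrator.migrate()`, shown in the Quick Start below.

```python
from lambda_to_gcf_migrator import LambdaToGCFMigrator

migrator = LambdaToGCFMigrator(openai_api_key="your-api-key")
result = migrator.migrate(
    source_path="./my-lambda-function",
    output_path="./my-gcf-function",
)

# Hypothetical: inspect the AI-generated AWS -> GCP service suggestions.
# `sdk_suggestions`, `aws_call`, `gcp_equivalent`, and `notes` are illustrative
# names, not part of the documented result object.
for suggestion in getattr(result, "sdk_suggestions", []):
    print(f"{suggestion.aws_call} -> {suggestion.gcp_equivalent} ({suggestion.notes})")
```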
## Installation

Install with pip:

```bash
pip install lambda-to-gcf-migrator
```

Or install from source:

```bash
git clone https://github.com/example/lambda-to-gcf-migrator.git
cd lambda-to-gcf-migrator
pip install -e .
```

## Quick Start

```python
from lambda_to_gcf_migrator import LambdaToGCFMigrator

# Initialize the migrator with your OpenAI API key
migrator = LambdaToGCFMigrator(openai_api_key="your-api-key")
# Migrate a Lambda function
result = migrator.migrate(
    source_path="./my-lambda-function",
    output_path="./my-gcf-function"
)
print(f"Migration completed: {result.success}")
print(f"Files generated: {result.generated_files}")from lambda_to_gcf_migrator.handler_detector import HandlerDetector
### Handler Detection

```python
from lambda_to_gcf_migrator.handler_detector import HandlerDetector

detector = HandlerDetector()
java_code = '''
package com.example;

import java.util.Map;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

public class MyHandler implements RequestHandler<Map<String, Object>, String> {
    @Override
    public String handleRequest(Map<String, Object> input, Context context) {
        return "Hello from Lambda!";
    }
}
'''
handlers = detector.detect_handlers(java_code)
for handler in handlers:
print(f"Found handler: {handler.class_name}.{handler.method_name}")
print(f"Input type: {handler.input_type}")
print(f"Output type: {handler.output_type}")| Variable | Description | Required |
|---|---|---|
OPENAI_API_KEY |
OpenAI API key for AI-powered analysis | Yes |
OPENAI_MODEL |
Model to use (default: gpt-4) |
No |
LOG_LEVEL |
Logging level (default: INFO) |
No |
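For example, rather than hard-coding the API key, it can be read from the `OPENAI_API_KEY` variable above and passed to the constructor shown in the Quick Start (a minimal sketch; the model override mirrors `OPENAI_MODEL`):

```python
import os

from lambda_to_gcf_migrator import LambdaToGCFMigrator

# Pull the settings described in the table above from the environment.
migrator = LambdaToGCFMigrator(
    openai_api_key=os.environ["OPENAI_API_KEY"],
    model=os.environ.get("OPENAI_MODEL", "gpt-4"),
)
```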
### Migrator Options

```python
migrator = LambdaToGCFMigrator(
    openai_api_key="your-api-key",
    model="gpt-4",
    preserve_comments=True,
    generate_tests=True,
    target_java_version="11"
)
```

## Service Mappings

The migrator automatically maps common AWS services to their GCP equivalents:

| AWS Service | GCP Equivalent |
|---|---|
| S3 | Cloud Storage |
| DynamoDB | Firestore / Bigtable |
| SQS | Cloud Pub/Sub |
| SNS | Cloud Pub/Sub |
| Lambda | Cloud Functions |
| API Gateway | Cloud Endpoints / API Gateway |
| CloudWatch | Cloud Logging / Monitoring |
| Secrets Manager | Secret Manager |
| KMS | Cloud KMS |

## Example Migration

### Before: AWS Lambda

```java
package com.example;

import com.amazonaws.services.lambda.runtime.*;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;

public class S3Handler implements RequestHandler<S3Event, String> {
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

    @Override
    public String handleRequest(S3Event event, Context context) {
        String bucket = event.getRecords().get(0).getS3().getBucket().getName();
        String key = event.getRecords().get(0).getS3().getObject().getKey();
        context.getLogger().log("Processing: " + key);
        S3Object obj = s3.getObject(bucket, key);
        // Process object...
        return "Processed: " + key;
    }
}
```

### After: GCP Cloud Function

The migrator maps the `RequestHandler<S3Event, String>` signature to a `BackgroundFunction<StorageObjectData>` entry point, replaces the AWS S3 client with the Cloud Storage client, and switches the Lambda `Context` logger to `java.util.logging`:

```java
package com.example;

import com.google.cloud.functions.BackgroundFunction;
import com.google.cloud.functions.Context;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import com.google.cloud.storage.Blob;
import com.google.events.cloud.storage.v1.StorageObjectData;
import java.util.logging.Logger;

public class S3Handler implements BackgroundFunction<StorageObjectData> {
    private static final Logger logger = Logger.getLogger(S3Handler.class.getName());
    private final Storage storage = StorageOptions.getDefaultInstance().getService();

    @Override
    public void accept(StorageObjectData event, Context context) {
        String bucket = event.getBucket();
        String name = event.getName();
        logger.info("Processing: " + name);
        Blob blob = storage.get(bucket, name);
        // Process object...
        logger.info("Processed: " + name);
    }
}
```

## Development

Run the test suite:

```bash
pytest tests/ -v
```

Format the code and run type checks:

```bash
black src/ tests/
isort src/ tests/
mypy src/
```

## Contributing

Contributions are welcome! Please read our Contributing Guide for details.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'feat: Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Acknowledgments
- OpenAI for providing the AI capabilities
- Google Cloud team for Cloud Functions documentation
- AWS team for Lambda documentation