Overview
The Iceberg connector enables you to access Iceberg tables in the Glue Data Catalog from your Glue ETL jobs. With Glue jobs you can run the operations that Apache Iceberg supports, such as DDL statements, data reads and writes, time travel queries, and streaming writes on Iceberg tables.
Highlights
- Create Iceberg tables in the Glue Data Catalog via DDL statements and the DataFrame API.
- Run queries, time travel queries, and streaming writes against Iceberg tables.
Details
Features and programs
Financing for AWS Marketplace purchases
Pricing
Vendor refund policy
No Refunds
Legal
Vendor terms and conditions
Content disclaimer
Delivery details
Glue 4.0
- Amazon ECS
- Amazon EKS
Container image
Containers are lightweight, portable execution environments that wrap server application software in a filesystem that includes everything it needs to run. Container applications run on supported container runtimes and orchestration services, such as Amazon Elastic Container Service (Amazon ECS) or Amazon Elastic Kubernetes Service (Amazon EKS). Both eliminate the need for you to install and operate your own container orchestration software by managing and scheduling containers on a scalable cluster of virtual machines.
Version release notes
New version of the container image
Additional details
Usage instructions
Please subscribe to the product in AWS Marketplace and activate the Glue connector from AWS Glue Studio.
The Iceberg connector enables you to access Iceberg tables from your Glue jobs. In particular, you can run DDL statements, read and write data, perform time travel queries, and issue streaming writes against Iceberg sources. For further details about Apache Iceberg, see https://iceberg.apache.org/ .
Using the Apache Iceberg Connector
Here are the steps to set up the Iceberg connector. For more details about using Iceberg on AWS, refer to https://iceberg.apache.org/aws/ .
Required Spark configuration
Before using this connector, you need to set the following Spark configuration so that the connector can communicate with the Glue Data Catalog.
You can set the configuration either as a job parameter of your Glue job or in the job script itself. Note that the spark.sql.extensions setting is optional, but it is recommended in order to access the full set of Iceberg operations.
Set on the job parameter
- Key: --conf
- Value: spark.sql.catalog.<catalog_name>=org.apache.iceberg.spark.SparkCatalog --conf spark.sql.catalog.<catalog_name>.warehouse=<S3_PATH> --conf spark.sql.catalog.<catalog_name>.catalog-impl=org.apache.iceberg.aws.glue.GlueCatalog --conf spark.sql.catalog.<catalog_name>.io-impl=org.apache.iceberg.aws.s3.S3FileIO --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
Set on the Glue job script:
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .config("spark.sql.catalog.<catalog>", "org.apache.iceberg.spark.SparkCatalog") \
    .config("spark.sql.catalog.<catalog>.warehouse", "s3://bucket/path/") \
    .config("spark.sql.catalog.<catalog>.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog") \
    .config("spark.sql.catalog.<catalog>.io-impl", "org.apache.iceberg.aws.s3.S3FileIO") \
    .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions") \
    .getOrCreate()
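Once the catalog is configured, typical operations can be sketched as follows. This is an illustrative example only: it assumes a catalog named glue_catalog (substituted for <catalog> above) and a Glue database my_db that already exists; the table and column names are hypothetical, and the code requires a Glue job with this connector on the classpath.

```python
# Illustrative sketch: assumes the Spark session above was created with the
# catalog name "glue_catalog", and that the Glue database "my_db" exists.
# Table and column names are examples, not part of the connector itself.

# Create an Iceberg table in the Glue Data Catalog via DDL.
spark.sql("""
    CREATE TABLE IF NOT EXISTS glue_catalog.my_db.events (
        id BIGINT,
        payload STRING
    ) USING iceberg
""")

# Write rows with the DataFrame API ...
df = spark.createDataFrame([(1, "hello"), (2, "world")], ["id", "payload"])
df.writeTo("glue_catalog.my_db.events").append()

# ... and read them back with SQL.
spark.sql("SELECT id, payload FROM glue_catalog.my_db.events").show()
```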
Connector options
You can pass the following options to the connector.
- path - the Iceberg table name, in the form <catalog_name>.<database>.<table>.
- as-of-timestamp or snapshot-id (optional, for reads only) - set one of these options to run a time travel query. See the Iceberg documentation about time travel for details.
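As a hedged sketch of how these time travel options are used with the Spark DataFrame reader (assuming the catalog setup above; the table name, snapshot ID, and timestamp below are placeholders, not real values):

```python
# Illustrative sketch: assumes a Spark session configured with an Iceberg
# catalog named "glue_catalog" and an existing table "my_db.events".
# The snapshot ID and timestamp values are placeholders.

# Read the table as of a specific Iceberg snapshot ID.
df_snapshot = (spark.read
    .format("iceberg")
    .option("snapshot-id", "1234567890123456789")
    .load("glue_catalog.my_db.events"))

# Read the table as of a point in time (milliseconds since the epoch).
df_as_of = (spark.read
    .format("iceberg")
    .option("as-of-timestamp", "499162860000")
    .load("glue_catalog.my_db.events"))
```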
NOTE: Glue natively supports Iceberg. Please refer to https://docs.aws.amazon.com/glue/latest/dg/aws-glue-programming-etl-format-iceberg.html .
Resources
Support
Vendor support
Please allow 24 hours for a response.
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.