AI - Parquet
Provides efficient columnar storage and simplified processing for large datasets through the open-source Parquet format. Committed to coding standards, community collaboration, and Apache licensing.
- Name
- Parquet - https://github.com/apache/parquet-mr
- Last Audited At
About Parquet
Parquet is an open-source columnar storage format developed under the Apache Software Foundation. It focuses on efficient storage of large datasets, with the primary goal of making data analysis faster through optimized I/O, encoding, and compression.
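As a quick illustration of how the format is typically written from Java, the sketch below uses the parquet-avro module to write a small Snappy-compressed file. The schema, file path, and field names are made up for illustration and are not taken from the project's documentation.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetWriter;
import org.apache.parquet.hadoop.metadata.CompressionCodecName;

public class WriteParquetExample {
    public static void main(String[] args) throws Exception {
        // Illustrative Avro schema: a record with two columns.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
            + "{\"name\":\"id\",\"type\":\"long\"},"
            + "{\"name\":\"name\",\"type\":\"string\"}]}");

        // Write one record to a Snappy-compressed Parquet file.
        try (ParquetWriter<GenericRecord> writer = AvroParquetWriter
                .<GenericRecord>builder(new Path("events.parquet"))
                .withSchema(schema)
                .withCompressionCodec(CompressionCodecName.SNAPPY)
                .build()) {
            GenericRecord record = new GenericData.Record(schema);
            record.put("id", 1L);
            record.put("name", "example");
            writer.write(record);
        }
    }
}
```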
Parquet simplifies data processing on platforms like Hadoop and Spark. The project adheres to strict coding standards to keep the codebase readable and maintainable. Its key offerings include:
- Pig compatibility: Parquet provides loaders and storers for Apache Pig, the platform whose Pig Latin language defines data-flow graphs, allowing tools like Pig and Hive to read and write Parquet files directly.
- Schema conversion: Parquet offers automatic schema conversion for object models such as Avro, Thrift, and Protocol Buffers, so developers can reuse existing data definitions (see the sketch after this list).
- Contributor community: The project has a dedicated community of authors and contributors who collaborate on enhancing and expanding Parquet's capabilities.
- Code of conduct: Parquet adheres to two codes of conduct, the Apache Software Foundation's and Twitter's, ensuring a positive and inclusive development environment.
- Licensing: Parquet is licensed under the Apache License, Version 2.0.
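To make the schema-conversion point concrete, here is a minimal sketch assuming the parquet-avro module's AvroSchemaConverter; the record definition is invented for illustration. Similar converters exist in the repository for other object models, and the Avro path shown here is only one of them.

```java
import org.apache.avro.Schema;
import org.apache.parquet.avro.AvroSchemaConverter;
import org.apache.parquet.schema.MessageType;

public class SchemaConversionExample {
    public static void main(String[] args) {
        // Illustrative Avro schema for a simple record.
        Schema avroSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
            + "{\"name\":\"id\",\"type\":\"long\"},"
            + "{\"name\":\"email\",\"type\":\"string\"}]}");

        // Convert the Avro schema to Parquet's internal MessageType representation.
        MessageType parquetSchema = new AvroSchemaConverter().convert(avroSchema);
        System.out.println(parquetSchema);
    }
}
```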
To contribute to Parquet, one can send pull requests against the official Git repository at https://github.com/apache/parquet-mr. The project encourages contributions in various forms, including code improvements and issue reporting.