Data
Role:
Develop high-performance, data-intensive backend systems.
Minimum qualifications:
5+ years of experience in Python or other scripting languages.
5+ years of experience in at least one statically typed language such as Go, Java, Scala, or C++.
Experience working with relational databases like MySQL, data warehouses like Snowflake, and distributed file systems like HDFS.
Experience developing clean, maintainable streaming and ETL pipelines.
Experience with cloud environments such as AWS (Lambda, serverless) or Azure/GCP (Cloud Functions).
Preferred qualifications:
Expert in data modeling for relational databases like MySQL and columnar formats like Parquet.
Expert in streaming technologies like Kinesis, Kafka, and Flink.
Expert in scaling large databases and data-intensive applications.
Experience with CI/CD technologies like Terraform, GitHub Actions, and Docker.
Experience with MLOps, including data ingestion, cleaning, training, tuning, serving, monitoring, and scaling.
Deep understanding of MySQL, Snowflake, Elasticsearch, DynamoDB, and other databases.
Expert in designing fault-tolerant systems.