Data Warehouse Architect
Lahore, Punjab, Pakistan
Full Time
BIU Department
Experienced
Job Title: Data Warehouse Architect
Location: Lahore
Position Type: Full-Time
Job Overview:
As a Data Warehouse Architect, you will be responsible for designing, developing, and maintaining robust data infrastructure to ensure the availability, scalability, and reliability of our data systems. You will work closely with cross-functional teams, including data scientists, analysts, and software engineers, to facilitate data-driven decision-making and optimize the efficient processing of large-scale data.
Key Responsibilities:
• Design Data Pipelines: Design, develop, and manage robust, scalable, and high-performance data pipelines and ETL processes to extract, transform, and load data from various sources into data warehouses or data lakes using tools like Spark, Kafka, or other cloud-native solutions.
• Real-Time Data Processing: Design, implement, and optimize real-time data processing pipelines using tools such as Kafka, Spark Streaming, or Flink to handle continuous data streams effectively.
• Big Data Solutions: Design and implement big data solutions to process billions of event records, ensuring workflows are optimized for efficiency and scalability.
• Data Modeling and Schema Design: Collaborate with stakeholders to understand data requirements and design appropriate data models and database schemas that meet business needs.
• Pipeline Optimization: Optimize data pipelines and ETL processes to enhance performance and efficiency, ensuring timely and accurate data delivery to end-users.
• Data Quality and Reliability: Monitor, troubleshoot, and resolve issues related to data quality, consistency, and integrity to ensure the reliability and accuracy of data systems.
• End-to-End Data Solutions: Architect and implement end-to-end data solutions, including data ingestion, processing, storage, and access layers, ensuring scalability and performance.
• Mentorship & Team Leadership: Mentor and guide a team of data engineers, fostering a collaborative and high-performing culture. Provide thought leadership in data engineering practices and promote knowledge-sharing initiatives.
• Data Governance: Implement and maintain data governance policies, ensuring compliance with data privacy and security regulations.
• Analytics Support: Collaborate with data scientists, business analysts, and data analysts, providing them with the data infrastructure and tools needed to enable advanced analytics and insight generation.
• Documentation: Document data engineering processes, data flows, and system architectures to facilitate knowledge sharing and maintain up-to-date technical documentation.
• Strategic Planning & Leadership: Assist with the unit's strategic and budget planning, and ensure that project timelines are managed and met.
• Cross-Functional Collaboration: Work closely with software engineers, infrastructure teams, and other cross-functional partners to optimize data infrastructure and ensure seamless integration with other systems.
• Innovation and Continuous Improvement: Stay current with industry trends and technologies in data engineering, recommending innovative solutions to improve processes and systems.
Skills and Qualifications:
• Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
• At least 5 years of proven experience with data warehouse and platform infrastructures, big data technologies, and large-scale data processing and ETL pipelines.
• Hands-on experience with cloud computing platforms such as AWS, Azure, or GCP.
• Strong experience with SQL and database technologies, including relational databases, SQL queries, and data modeling.
• Proven experience with big data solutions and technologies such as Hadoop, Hive, and distributed systems.
• Experience with distributed and NoSQL database environments such as Cassandra, MongoDB, or HBase.
• Proficiency in Python, Java, or Scala, with experience in data manipulation and processing frameworks like Apache Spark.
• Familiarity with building scalable applications from the ground up, including server configuration and platform architecture.
• Knowledge of data integration and workflow management tools such as Apache Airflow, Glue, Dagster, or ADF.
• Strong analytical and problem-solving skills, with the ability to analyze complex data-related issues and propose effective solutions.
• Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
• Attention to detail and a strong commitment to delivering high-quality work within established timelines.
• Proven experience in the fintech domain is a plus.
How to Apply:
If you are a dedicated data warehouse infrastructure professional with a passion for building reliable, scalable data systems and a desire to work in a dynamic and collaborative environment, we encourage you to apply. Please submit your resume and a cover letter detailing your relevant experience at https://acemoneytransfer.applytojob.com/apply
ACE Money Transfer is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.