Big Data Engineer

LiberaTrade is redefining the supply chain by combining AI with real-world experience. They are looking for a Big Data Engineer to help supercharge their work by making data and data tools useful, accessible, and reliable.
Your work will revolve around collecting, storing, processing, and analyzing huge sets of data to help LiberaTrade's solutions get smarter.

In this role, you will:

  • Choose optimal solutions for collecting, storing, processing, and analyzing huge sets of data, and integrate them within the product architecture.

  • Integrate data from disparate sources and perform pre-processing

  • Monitor performance and make any necessary infrastructure changes

  • Build internal tooling, applications and libraries

  • Deploy and maintain machine learning models

  • Streamline data pipelines

  • Design solutions independently based on high level architecture

  • Maintain the production system end-to-end

  • Collaborate with other team members from development and research

  • Implement cloud migration strategy

  • Collaborate with cross-functional teams, such as DataOps, InfoSecOps

The ideal candidate will:

Have the following knowledge and experience:

  • In-depth understanding of distributed computing principles and the data engineering life cycle (from data ingestion to visualization and representation)

  • Proficiency in varied data storage mechanisms and frameworks, for both data in motion and data at rest

  • Proficiency in the Big Data tools and frameworks required to provide the requested capabilities

  • Experience building stream-processing systems using solutions such as Storm or Spark Streaming

  • Experience integrating data from multiple data sources

  • A good understanding of various development and integration techniques

  • A very good understanding of various data structures, OOP, and algorithms

  • Proficiency in defining data retention policies

  • Proficiency in data governance and management

Be proficient in the following tools and technologies:

  • Hadoop v2, MapReduce, HDFS

  • Big Data querying tools such as Pig, Hive, and Impala

  • The Spark data processing framework

  • NoSQL databases such as HBase, Cassandra, MongoDB

  • Various ETL techniques and frameworks such as Flume

  • Various messaging systems such as Kafka or RabbitMQ

  • Big Data ML toolkits, such as Mahout, SparkML, or H2O

  • Lambda Architecture, including familiarity with its advantages and drawbacks

  • Data governance tools, such as Reiid

  • Microservice architecture

  • Some exposure to Docker, virtualization, and containers is desirable

  • GitHub or GitLab

  • Data representation and visualization tools such as ReportServer, Pentaho

  • The underlying fundamentals and limitations of C/C++, Python, R, Scala, and Julia

Have the following qualifications:

  • Minimum Bachelor's degree in Computer Science, Information Technology, or Software Engineering

  • A degree in Math, Physics, or Statistics may be considered if supplemented with advanced certification in data engineering

  • Certain certifications may give an edge, though they are not absolutely required: AWS, Azure, GCP, OpenStack

About the employer:

LiberaTrade uses the power of data to stimulate demand and grow sales, while sharing these demand insights with suppliers and logistics partners to ensure required inventory can be sold and converted to cash. They then use their understanding of risk to connect providers of capital with SMEs and others who need it, providing the liquidity needed for today's dynamic supply chains.
Salary and benefits:

PKR 100,000 - 150,000

Benefits are currently in the process of being set up