GeekPSoc IT Services

Fulfilling IT Needs

About Us:

We are an enterprise-focused platform that enables businesses to hire, assess, and manage highly skilled professionals for gig-based projects. We offer experts a gateway to build freelance and consulting careers with large-scale enterprises. As a newly launched division of a pioneer in the Gig Economy in India, we aim to revolutionize the way the world works and uplift millions of careers.

About the Role:

– Location: Remote (Hybrid in Bangalore starting October 2024)

– Timings: Full time (as per company working hours)

– Notice Period: Immediate joiners or within 15 days

– Experience: 4-6 Years

Responsibilities:

– Develop and maintain data pipelines, ensuring secure transfer of data into our Big Data financial crime solution.

– Implement and standardize ETL processes for data transformation and cleansing.

– Design and implement data models and schemas for analytical applications and data visualization tools.

– Monitor and resolve issues with our Big Data financial crime solution.

– Implement security and data privacy measures to protect sensitive data.

– Develop and maintain documentation and training materials for end-users and stakeholders.

– Stay updated with emerging trends and technologies in Big Data, data analytics, and data engineering.

Requirements:

– Bachelor’s degree in Computer Science, Engineering, or a related field.

– 4+ years of experience in designing, configuring, and implementing Big Data solutions, preferably with financial crime solutions.

– Strong programming skills in Java, Python, or Scala.

– Proficiency in data modeling, data warehousing, and ETL processes.

– Expertise in Big Data technologies and architectures, including Hadoop, Spark, and NoSQL databases.

– Excellent analytical and problem-solving skills.

– Strong communication and interpersonal skills.

– Ability to work in a fast-paced, team-oriented environment.

– Ability to manage multiple priorities and deadlines.

– Experience with AWS is a plus.

Compensation & Job Perks:

– Opportunity to build AI and Analytics products for global markets.

– Chance to innovate, learn, and grow.

– Attractive variable compensation package.

– Flexible working hours focused on results.

– Work with an award-winning organization in the exciting fields of artificial intelligence and advanced machine learning.

Important Note:

We are seeking a Data Engineer with experience in Python, PySpark, Elasticsearch, MySQL, ETL, and Unix. This role is not intended for cloud-native data engineers; it requires hands-on experience with open-source technologies.

To apply for this job email your details to