
Senior Big Data Developer

Remote, Poland
External job listing
EPAM

Salary negotiable
IT and Telecommunications
Full-time
Remote
Requirements
  • Bachelor’s degree in Computer Science, Mathematics, or a related technical field

  • 5+ years of experience in software development with Big Data technologies

  • Advanced knowledge of Python, Scala, or Java

  • Proficient with Spark and Databricks

  • Experience with cloud platforms such as Azure, Google Cloud Platform, Amazon Web Services

  • Demonstrated experience with big data models, databases, and tools

  • Strong understanding of different data file formats like JSON, Parquet, Avro

  • Hands-on experience with SQL and NoSQL databases, including Cassandra and MongoDB

  • Proficiency in data warehousing solutions and ETL processes

  • Proven ability to work within a fast-paced, team-oriented environment

  • Detail-oriented with excellent analytical and problem-solving skills

  • Knowledge of software development methodologies and best practices

  • Exceptional oral and written communication skills in English (B2+)

Responsibilities
  • Implement data ingestion from diverse sources utilizing technologies including RDBMS, REST HTTP APIs, flat files, streams, time series data, and SAP

  • Research and utilize Big Data technologies for effective data ingestion

  • Process and transform data using tools such as Spark and various Cloud Services

  • Understand and execute project-specific business logic through programming languages supported by the primary data platform

  • Optimize data retrieval, develop dashboards, and perform data validation

  • Collaborate with diverse teams including data scientists, analysts, and IT professionals

  • Provide thought leadership on Big Data best practices and technology trends

  • Maintain and ensure integrity and security of big data sources and platforms

  • Perform troubleshooting, debugging, and upgrading of big data systems and applications

Seniority
  • Senior

Nice to have
  • Experience with Hadoop, Spark, Kafka, Oozie, Airflow, HDFS, YARN

  • Experience with message queues, especially Kafka

  • Production experience in search technologies such as Solr, OpenSearch, or Lucene

Description

We are looking for a Senior Big Data Developer to join our growing Data Practice and make our team even stronger. Our projects and technologies are highly diverse, covering the full range of tools currently on the market and represented by open-source communities. We provide services to clients in many domains, including Finance, Health Care, and Insurance, so you will have the opportunity to develop in any direction you choose.

Keywords / Skills
Data Software Engineering
This offer was imported from an external portal.