Job Details

Java Tech Lead

2026-03-31 | Acestack LLC | All cities, AK
Description:

ROLE: Java Tech Lead

Location: Phoenix, Raleigh, Remote
Duration: 6 months


Experience requested: 8 to 10 years

Requirements:
• Strong NiFi Expertise: Deep understanding of Apache NiFi architecture, processors, controller services, and data flow management capabilities.
• Programming Languages: Proficiency in Java, Spring Boot, and API microservices.
• Database and SQL: Familiarity with databases, SQL, and data manipulation techniques.
• Big Data Technologies: Kafka (event-driven and streaming architecture).
• Apache Airflow: Workflow orchestration.
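The Kafka requirement above refers to event-driven streaming, where producers publish events to topics and consumers react to them asynchronously. A minimal in-memory sketch of that publish/subscribe pattern, using only the Python standard library (real work would go through a Kafka client such as confluent-kafka against a broker cluster; the topic name and event payload here are made up for illustration):

```python
from collections import defaultdict, deque

class ToyBroker:
    """In-memory stand-in for a Kafka-style topic broker.

    Illustrative only: it shows the decoupling between producers
    and consumers, not Kafka's partitioning, offsets, or durability.
    """
    def __init__(self):
        self._topics = defaultdict(deque)       # topic -> queued events
        self._subscribers = defaultdict(list)   # topic -> callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        # Producers only enqueue; they never call consumers directly.
        self._topics[topic].append(event)

    def poll(self):
        """Deliver all queued events to every subscriber of each topic."""
        for topic, queue in self._topics.items():
            while queue:
                event = queue.popleft()
                for callback in self._subscribers[topic]:
                    callback(event)

broker = ToyBroker()
received = []
broker.subscribe("orders", received.append)   # consumer side
broker.publish("orders", {"id": 1, "amount": 99.0})  # producer side
broker.poll()
```

The point of the pattern is that the producer and consumer never reference each other, only the topic, which is what makes event-driven architectures easy to extend with new consumers.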

• Design and Implementation:
o Creating and implementing data flow solutions using NiFi, including defining data sources, destinations, and transformation processes.
• Ingestion and Transformation:
o Managing data ingestion from various sources (e.g., files, databases, APIs) and transforming data into desired formats for downstream consumption.
• Routing and Enrichment:
o Configuring NiFi to route data to appropriate destinations and enriching data with additional information (e.g., timestamps, location).
• Scalability and Performance:
o Ensuring data flows are scalable and performant to handle varying data volumes and velocity.
• NiFi Cluster Management:
o Maintaining and improving NiFi clusters, including configuration, optimization, and troubleshooting.
• Microservices Integration:
o Developing and integrating microservices to support data conditioning, format validation, and transformation processes.
• Troubleshooting and Issue Resolution:
o Identifying and resolving issues related to data flows, NiFi configurations, and data integration processes.
• Airflow Expertise:
o Possessing deep knowledge of Airflow's architecture, including schedulers, executors (Celery, Kubernetes), and plugin development.
• Workflow Design and Development:
o Designing and developing complex, modular, and reusable DAGs (Directed Acyclic Graphs) to automate data pipelines.
• Performance Optimization:
o Identifying and addressing performance bottlenecks in Airflow environments and implementing best practices for orchestration and scheduling.
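The Airflow responsibilities above center on designing DAGs: tasks plus dependency edges, executed in topological order. A toy standard-library sketch of that execution model (not actual Airflow code, which would declare an `airflow.DAG` with operators and `>>` dependencies; the extract/transform/load steps here are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps; in Airflow each would be a task
# (e.g. a PythonOperator) inside a DAG definition file.
def extract(ctx):   ctx["raw"] = [1, 2, 3]
def transform(ctx): ctx["clean"] = [x * 10 for x in ctx["raw"]]
def load(ctx):      ctx["loaded"] = sum(ctx["clean"])

# Dependency edges, mirroring Airflow's `extract >> transform >> load`:
# each key maps a task to the set of tasks it depends on.
dag = {"transform": {"extract"}, "load": {"transform"}}
tasks = {"extract": extract, "transform": transform, "load": load}

def run_pipeline():
    ctx = {}
    # static_order() yields tasks only after all their upstream
    # tasks have been yielded, which is exactly what a scheduler does.
    for name in TopologicalSorter(dag).static_order():
        tasks[name](ctx)
    return ctx

result = run_pipeline()
```

Keeping the dependency graph acyclic is what lets the scheduler parallelize independent branches and retry a failed task without rerunning its completed upstream tasks.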

