Description
We are working alongside our pioneering client in the Ag-Tech sector, who is looking to hire a Data Engineer with experience in Python scripting, AWS technologies, SQL, geospatial data processing, and BI tools.
Who we are
We are a dynamic recruitment agency that has grown steadily since it was founded in 2012. We work with hundreds of high-profile clients globally to help them source and retain top-tier talent.
The client
Our client is an established Ag-Tech company based in Illinois that provides growers with high-level data and insights, allowing them to increase crop yields, improve sustainability, and reduce costs.
Location: Illinois / Indiana Preferred
The Role
The primary focus of the role is working with agricultural data to provide analytics and insights that support the client's sales team and product users.
Requirements/Qualifications
- Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field.
- Proven experience as a Data Engineer or BI Analyst.
- Hands-on experience with Python scripting and ETL processes.
- Strong proficiency in SQL (PostgreSQL) and working with Amazon Redshift.
- Proficient with geospatial data processing using Python packages such as rasterio and geopandas.
- Proficient with Looker, including optimising BI tool performance.
- Familiarity with Agile methodologies and collaborative coding practices.
- Excellent problem-solving abilities.
- Strong communication skills.
- Ability to translate business needs into technical solutions.
Responsibilities
Data Pipeline Automation:
- Write scripts to streamline data transfer between Amazon S3 and PostgreSQL databases.
- Manage and optimise data warehousing systems using PostgreSQL and Amazon Redshift for efficient data handling.
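By way of illustration, a transfer script of this kind might look like the minimal sketch below. It assumes a CSV export sitting in S3 and a staging table in PostgreSQL; the bucket, key, table, and connection details are placeholders rather than the client's actual infrastructure.

```python
import tempfile

import boto3
import psycopg2

BUCKET = "example-agronomy-data"   # hypothetical bucket
KEY = "exports/field_yields.csv"   # hypothetical object key
TABLE = "staging.field_yields"     # hypothetical target table


def load_csv_from_s3(dsn: str) -> None:
    """Download a CSV from S3 and bulk-load it into PostgreSQL via COPY."""
    s3 = boto3.client("s3")
    with tempfile.NamedTemporaryFile(suffix=".csv") as tmp:
        s3.download_file(BUCKET, KEY, tmp.name)
        with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
            with open(tmp.name) as f:
                # COPY is far faster than row-by-row INSERTs for bulk loads.
                cur.copy_expert(f"COPY {TABLE} FROM STDIN WITH CSV HEADER", f)


if __name__ == "__main__":
    load_csv_from_s3("postgresql://user:password@localhost:5432/agdata")
```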
Database Optimisation:
- Develop and enhance database queries to improve performance and functionality.
- Create materialised views and recommend structural optimisations to accelerate data analysis.
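A minimal sketch of this kind of optimisation is shown below, using a hypothetical monthly yield summary as the materialised view; the table, view, and column names are illustrative only.

```python
import psycopg2

CREATE_MV = """
CREATE MATERIALIZED VIEW IF NOT EXISTS mv_field_yield_summary AS
SELECT field_id,
       date_trunc('month', observed_at) AS month,
       avg(yield_bu_per_acre)           AS avg_yield
FROM   staging.field_yields
GROUP  BY field_id, date_trunc('month', observed_at);
"""


def refresh_summary(dsn: str) -> None:
    """Create the summary view if needed, then refresh it."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(CREATE_MV)
        # Dashboards then read pre-aggregated rows instead of scanning raw data.
        cur.execute("REFRESH MATERIALIZED VIEW mv_field_yield_summary;")
```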
Geospatial Data Handling:
- Use Python libraries like rasterio and geopandas to process and analyse spatial datasets.
- Integrate geospatial information into broader analytical workflows to enhance decision-making.
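As a sketch of what this processing can look like, the snippet below summarises a raster layer per field boundary using rasterio and geopandas; file paths and column names are assumptions for illustration, not the client's data model.

```python
import geopandas as gpd
import numpy as np
import rasterio
from rasterio.mask import mask


def mean_value_per_field(raster_path: str, fields_path: str) -> gpd.GeoDataFrame:
    """Attach the mean raster value (e.g. an index layer) to each field polygon."""
    fields = gpd.read_file(fields_path)          # field boundary polygons
    with rasterio.open(raster_path) as src:
        fields = fields.to_crs(src.crs)          # align CRS before masking
        means = []
        for geom in fields.geometry:
            data, _ = mask(src, [geom], crop=True)
            band = data[0].astype("float64")
            if src.nodata is not None:
                band = band[band != src.nodata]  # drop nodata pixels
            means.append(float(np.nanmean(band)) if band.size else float("nan"))
    fields["mean_value"] = means
    return fields
```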
Business Intelligence and Reporting:
- Build dashboards and analytical reports in Looker to visualise key metrics.
- Optimise Looker performance to support real-time and large-scale data analysis.
Agricultural Data Insights:
- Analyse datasets to uncover insights into crop performance, including vegetation indices, weather trends, and input applications like nitrogen.
- Collaborate with commercial teams to address challenges using data-driven strategies.
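For context, vegetation indices such as NDVI are typically derived directly from imagery bands, as in the minimal sketch below; the band order and file name are assumptions about the imagery rather than a statement of the client's data format.

```python
import numpy as np
import rasterio


def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), with divide-by-zero guarded."""
    red = red.astype("float64")
    nir = nir.astype("float64")
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where((nir + red) == 0, np.nan, (nir - red) / (nir + red))


with rasterio.open("field_scene.tif") as src:  # hypothetical multi-band scene
    red_band = src.read(3)   # assuming band 3 is red
    nir_band = src.read(4)   # assuming band 4 is near-infrared
    print(f"mean NDVI: {np.nanmean(ndvi(red_band, nir_band)):.3f}")
```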
Collaboration and Development Practices:
- Participate in Agile workflows to prioritise and execute tasks effectively.
- Use Git for version control and AWS tools for building scalable, cloud-based solutions.