Software Architect - Data Integration

Offered by GeoPhy


About this job

Location options: Visa sponsorship offered
Job type: Full-time
Experience level: Senior, Lead
Industry: Financial Technology, Real Estate
Company size: 51-200 people
Company type: VC Funded



Technologies

etl, continuous-integration, python, data-integration, togaf



Job description

As a GeoPhy Software Architect - Data Integration, you will define a componentized, metadata-driven architecture for our Data Delivery Pipelines. A typical pipeline encompasses data extraction and ETL components and, where needed, geospatial processing elements. You will design a set of reusable "plug and play" components that can be shared across multiple pipelines to accelerate pipeline delivery (an illustrative sketch follows the list below). As an Architect, you will assist the Product Owners in defining priorities and provide technical leadership to the teams that build the data pipelines.

What you'll be responsible for

  • Breaking down the existing Data Pipelines into sets of reusable, loosely coupled components
  • Defining the reusable elements
  • Designing the metadata-driven solutions for each element and defining the configuration requirements
  • Breaking down the solution into deliverables
  • Working with Product Owners to align the priorities and sequence of the delivery of various solution components
  • Providing Technical Leadership to our ETL and Data Extraction Software teams
  • Ensuring alignment between the Data Integration and other architecture domains (Semantic Architecture, Data Science solutions, Data Processing solutions)
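To make the idea concrete, here is a minimal, hypothetical sketch of a metadata-driven, "plug and play" pipeline in Python. The registry, component names, and configuration format below are illustrative assumptions, not GeoPhy's actual design:

```python
# Illustrative only: a toy metadata-driven pipeline, not GeoPhy's real codebase.
# Components register themselves in a shared registry and are wired together
# purely from configuration ("metadata"), so pipelines are assembled, not coded.

import csv
from typing import Any, Callable, Dict, List

# Registry mapping component names to factory functions.
COMPONENTS: Dict[str, Callable[..., Callable[[Any], Any]]] = {}

def component(name: str):
    """Decorator that registers a component factory under a name."""
    def register(factory):
        COMPONENTS[name] = factory
        return factory
    return register

@component("csv_extract")
def csv_extract(path: str):
    # Hypothetical extraction step: reads rows from a CSV file.
    def run(_):
        with open(path, newline="") as f:
            return list(csv.DictReader(f))
    return run

@component("filter")
def filter_rows(field: str, equals: str):
    # Hypothetical transform step: keeps rows where row[field] == equals.
    def run(rows):
        return [r for r in rows if r.get(field) == equals]
    return run

def build_pipeline(spec: List[dict]) -> Callable[[Any], Any]:
    """Assemble a pipeline from a metadata spec: a list of {component, params}."""
    steps = [COMPONENTS[s["component"]](**s.get("params", {})) for s in spec]
    def run(data=None):
        for step in steps:
            data = step(data)
        return data
    return run

# A pipeline is declared as data (e.g. loaded from a YAML/JSON config file);
# calling pipeline() executes the steps in order against a hypothetical CSV.
pipeline = build_pipeline([
    {"component": "csv_extract", "params": {"path": "listings.csv"}},
    {"component": "filter", "params": {"field": "country", "equals": "NL"}},
])
```

In a setup like this, building or reordering a pipeline means editing metadata rather than code, which is what allows components to be shared across many pipelines.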

What we're looking for

  • Proven experience in the Software Architecture domain, with a specific focus on Data Integration, Data Provisioning, or Data Management
  • Familiarity with Architecture Delivery Frameworks (e.g., TOGAF)
  • Hands-on experience with at least one professional ETL tool (e.g., Informatica PowerCenter, IBM DataStage)
  • Hands-on experience with Data Quality framework components (profiling, business data quality assessment, data reliability assessment)
  • Proven experience with designing metadata-driven solutions
  • Experience with Python
  • Experience with delivery in Agile environments
  • Understanding of Data Scraping / Data Extraction technologies
  • Understanding of geospatial data
  • Proficiency in English
  • Ability to work efficiently in distributed teams
  • Strong technical mentorship skills
  • Ability to quickly acquire new technical skills

Bonus points for

  • Understanding of RDF / the Semantic Web paradigm
  • Experience with AWS
  • Experience with infrastructure architecture topics, namely:
      - data pipeline monitoring and execution status visibility
      - scaling of data integration components

