Description
OpenCredo (OC) is a UK-based software development consultancy that helps clients achieve more by leveraging modern technology and delivery approaches. We are a group of passionate technologists who thrive on tackling complex challenges and delivering pragmatic, sustainable solutions for our clients. Curious and tenacious, but always sensitive to our clients’ context, we are not afraid to speak our minds to help steer our clients toward understanding and achieving their key goals.
A contract role for an advanced data and analytics architect has opened up within one of our consulting teams. The role involves working across a diverse set of data estates for a customer, primarily in an assessment and consulting capacity. A key outcome will be to provide strategic and pragmatic insights into the different platforms and estates, which will be used to help shape future planning and direction. You will work alongside other OC technical architects and senior technologists.

This role will suit someone with broad experience in data and the surrounding ecosystem. As it requires extensive engagement with C-level executives, exceptional communication skills and strong business acumen are essential. You will be expected to bring expert advice, guidance, best practices, and recommendations on specific core technologies such as Databricks and Vertex AI, as well as a range of AI techniques (generative AI, feature engineering for AI/ML models, MLOps).
Requirements
- Strong, proven experience with Databricks and GCP, in particular the Vertex AI platform, with knowledge of best practices and the ability to recommend strategies for effective adoption
- Experience with MLOps, including preparing training data sets, the training process, model selection, and deployment
- Experience in optimizing data platforms to help accelerate and achieve strategic business goals while managing costs
With a solid and deep understanding of:
- Data platform architecture, including its different use cases and topologies
- Standard technologies which can be used for target use cases (ingestion, orchestration, compute, etc.)
- How to define high-level, standardized processes for all parts of the data lifecycle (from acquisition and storage through to reporting and compute/analytics)
- The modern data and analytics toolchain, from Python notebooks and data ingestion/ETL/ELT tooling to standard/enterprise reporting tools and platforms (for example, but not restricted to, Looker Studio, Power BI, Alteryx, and Tableau)