Solutions Architect

Offered by Splice Machine


About this job

Job type: Full-time
Role: Database Administrator



Technologies

oracle, hadoop, java



Job description

Splice Machine, an AI predictive platform startup company, is looking for a Solutions Architect with experience working with complex distributed systems and large data sets using Spark and Hadoop.  Work from anywhere in the US.

Splice Machine’s predictive platform solution helps companies turn their Big Data into actionable business decisions. Our predictive platform eliminates the complexity of integrating multiple compute engines and databases necessary to power next-generation enterprise predictive AI and Machine Learning applications.

Some of our use-cases include:

  • At a leading credit card company, Splice Machine powers a customer service application that returns sub-20ms record lookups on 7 PB of data
  • At a Fortune 50 bank, Splice Machine is replacing a leading RDBMS and data warehouse with one platform in a customer profitability application
  • At an RFID tag company, Splice Machine is replacing a complex architecture for a retail IoT solution
  • At a leading financial service company, Splice Machine powers an enterprise data hub for 10,000 users
  • At a leading healthcare solution provider, Splice Machine powers a predictive application to learn models and use them to save lives in hospitals

Splice Machine’s CEO and Co-Founder, Monte Zweben, is a serial entrepreneur in AI, selling his first company, Red Pepper, to PeopleSoft/Oracle for $225M and taking his second company, Blue Martini, through one of the largest IPOs of the early 2000s ($2.9B). He started Splice Machine to disrupt the $30 billion traditional database market with the first open-source dual-engine database and predictive platform to power Big Data, AI, and Machine Learning applications.

Splice Machine has recruited a team of legendary Big Data advisors, including Roger Bamford, “Father of Oracle RAC”; Michael Franklin, former Director of the AMPLab at UC Berkeley; Ken Rudin, Head of Growth and Analytics for Google Search; Andy Pavlo, Assistant Professor of Computer Science at Carnegie Mellon University; and Ray Lane, former COO of Oracle, to collaborate with the Splice Machine team as we blaze new trails in Big Data.

Solutions Architect

About You

  • You have implemented several large (40-50 node) Hadoop projects and have demonstrated successful outcomes.
  • You take pride in working to understand, quantify and verify the business needs of customers and their specific use cases, translating these needs into big data, DB, or ML capabilities.
  • You are comfortable engaging both business and engineering leadership, team leads and individual contributors to drive successful business outcomes.
  • Your project leadership style emphasizes collaboration and follow-through.
  • You are very technical and are accustomed to working with architects, developers, project managers, and C-level experts to ensure the best implementation practices and use of the product.

About What You’ll Work On

  • Build the customer’s trust by maintaining a deep understanding of our solutions and their business.
  • Speak with customers about the Splice Machine features and functionality most relevant to their specific business needs.
  • Manage all post-sales technical activity, working on a cross-functional team of Splice Machine and customer resources for solution implementation.
  • Ensure that a plan for each customer’s deployment, change management, and adoption is in place and communicated to all contributors.
  • Act as the voice of the customer and provide internal feedback on how Splice Machine can better serve our customers while working closely with Product and Engineering on identification and tracking of new feature and enhancement requests.
  • Help Sales identify new business opportunities within the customer in other departments.
  • Increase customer retention by conducting regular check-in calls and performing quarterly business reviews that drive renewals, upsells, adoption, and customer references.

Requirements

  • Expertise in Cloudera and/or Hortonworks Hadoop solutions.
  • 7+ years of experience architecting complex database and big data solutions for enterprise software.
  • Experience working in a complex, multi-functional environment.
  • Hands-on experience with SQL, Java, and database tuning.
  • Experience with scalable, highly available distributed systems.
  • B.S. in Computer Science, B.A., or equivalent work experience.

Our people enjoy access to the best tools available, an open and collaborative work environment, and a supportive culture that inspires them to do their very best. We offer great salaries, generous equity, employee and family health coverage, flexible time off, and an environment that gives you the flexibility to seize moments of inspiration, among other perks.

We encourage you to learn more about working here!


