If you enjoy working with Hadoop, are excited by doing cool things with
interesting data, and have experience working with customers, we'd like to
hear from you. For this role, experience writing MapReduce, Hive and/or Pig
jobs for Hadoop is a must. Experience working on the internals of Hadoop is
a plus, but not required.
We're looking to hire someone who will work primarily in the greater NY
metropolitan area, but we will consider people elsewhere who are willing to
travel. Our
headquarters are in the Bay Area, and everyone spends some time here.
Below is the full job description. If you are interested, please send your
resume to jobs+apache@cloudera.com, and include
"Solutions Architect" in the subject (and don't forget the +apache - it's
how we know you're already involved with the community).
Cloudera seeks a full-time Solutions Architect to provide hands-on
assistance to its customers in deploying, managing and using Hadoop for
large-scale data analysis. This person will work remotely and on-site with
customers to understand their business problems, find ways that Hadoop and
Cloudera's products can be used to solve them, and will design and build, in
collaboration with the customer, Hadoop-based analytics.
This position requires a combination of software architecture, design and
implementation skills. The Solutions Architect will not, in general, work on
Hadoop internals; rather, he or she will develop applications that run
inside the Hadoop framework to solve problems for the customer. The position
requires the ability to understand the real business problems confronting
the customer, and to apply Cloudera's products and services to them. Very
good customer-facing skills, including presentation skills and written and
verbal communication, are critical.
Responsibilities:

- Assess the technical fit of customer requirements to Cloudera products.
- Explain to prospects and customers, in meetings, by email, at
conferences and elsewhere, how Hadoop and Cloudera's products work.
- Work with customers to understand business goals and technical
requirements for proof-of-concept (POC) and related projects.
- Help customers understand how Hadoop fits with their existing systems.
- Work with customer operations teams to deploy Hadoop on a wide variety
of platforms, in the cloud and in local data centers.
- Architect and implement data processing pipelines, including writing
custom MapReduce, Pig and Hive jobs.
- Implement custom connectors for existing systems, UDFs for working with
customer data, and tools to simplify recurring tasks.
- Train developers, operators and analysts who will need to work with
Hadoop, Hive, Pig and other sub-projects.
- Help customers sell Hadoop internally to upper management.
- Represent the company personally in customer engagements, imparting key
technical, product and company information to advance an opportunity while
maintaining a professional demeanor and positive attitude.
Requirements:

- At least 4 years' experience in software development
- At least 2 years' experience in a customer-facing technical role,
preferably as a sales engineer or Solutions Architect
- At least 1 year of experience developing applications for Hadoop
- Excellent verbal, written and interpersonal communication skills
- Strong development and debugging skills in Java, C++, and C required.
Knowledge of Perl, Python, PHP, Ruby and similar scripting languages is a
plus
- BS degree in computer science; MS or PhD preferred
get hadoop: cloudera.com/hadoop
online training: cloudera.com/hadoop-training