Big Data Architect
- Experienced in engaging stakeholders to understand their Big Data objectives and in using that information to plan the computing framework: appropriate hardware and software, data sources and formats, analytical tools, data storage, and results consumption.
- Expert in Big Data tools and technologies, including Linux; set-up and administration of the Hadoop platform; and working knowledge of Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, and MapReduce.
- Strong understanding of programming and scripting languages such as Java, Scala, Python, and shell scripting.
- Practical knowledge of the end-to-end design and build process for near-real-time and batch data pipelines.
- Expertise in SQL and data modeling within an Agile development process.
- Certifications in Big Data and Cloud system architecture
- Align the organization’s Big Data solutions with client initiatives as requested.
- Provide top-quality solution design and execution for all Big Data Projects
- Utilize Big Data technologies to design, develop, and evolve scalable and fault-tolerant distributed components
- Provide support in defining the scope and sizing of work
- Strong technical team leadership, mentorship, and collaboration.
- Strong decision-making skills in data analysis and the ability to architect large-scale data solutions.
- Engage with clients to understand strategic requirements
- Responsible for translating business requirements into technology solutions
- Work with domain experts to put together a delivery plan and keep it on track.
- Organize all meetings with customers and ensure prompt resolution of gaps and roadblocks
- Stay current on the latest technologies to ensure maximum ROI for clients.
- Responsible for the design and execution of abstractions and integration patterns (APIs) to solve complex distributed computing problems.
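The batch vs. near-real-time distinction in the pipeline bullet above can be illustrated with a toy sketch: a batch pipeline processes the whole dataset at once, while a near-real-time pipeline processes small windows (micro-batches) as they arrive. This uses only the Python standard library; the `micro_batch` helper and window size are illustrative, not tied to any specific framework such as Spark.

```python
from itertools import islice

def micro_batch(stream, size):
    """Group an unbounded iterator into fixed-size micro-batches,
    the core idea behind near-real-time (micro-batch) pipelines."""
    it = iter(stream)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Batch pipeline: aggregate the whole dataset in one pass.
records = range(10)
batch_total = sum(records)  # 45

# Near-real-time pipeline: aggregate each micro-batch as it "arrives".
stream_totals = [sum(b) for b in micro_batch(range(10), size=3)]
# [3, 12, 21, 9] -- one partial result per window
```

In a production pipeline the windows would come from a message broker such as Kafka rather than an in-memory range, but the per-window aggregation pattern is the same.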
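The SQL and data modeling bullet above can be made concrete with a minimal dimensional-model sketch: one fact table joined to one dimension table, aggregated per dimension value. The schema and table names (`fact_sales`, `dim_product`) are invented for illustration; `sqlite3` stands in for a real warehouse engine.

```python
import sqlite3

# Toy dimensional model: a fact table keyed to a dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales  VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# Aggregate facts per dimension value -- the basic warehouse query shape.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
# [('gadget', 7.5), ('widget', 15.0)]
```

The same fact/dimension split underlies star-schema modeling in Hive or Impala; only the engine and scale change.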
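The abstraction-and-integration-pattern bullet above boils down to hiding heterogeneous backends behind one stable interface so client code never depends on a particular engine. A minimal sketch, with `QueryEngine` and `InMemoryEngine` as invented illustrative names:

```python
from abc import ABC, abstractmethod

class QueryEngine(ABC):
    """Abstraction over heterogeneous query backends; concrete
    implementations might wrap Hive, Impala, or an in-memory store."""
    @abstractmethod
    def run(self, query: str) -> list: ...

class InMemoryEngine(QueryEngine):
    """Trivial backend: 'queries' are substring filters over rows."""
    def __init__(self, rows):
        self.rows = rows

    def run(self, query: str) -> list:
        return [r for r in self.rows if query in r]

def count_matches(engine: QueryEngine, query: str) -> int:
    # Client code depends only on the abstraction, not on the backend,
    # so engines can be swapped without touching callers.
    return len(engine.run(query))
```

Swapping in a different `QueryEngine` subclass changes the execution backend without any change to `count_matches`, which is the point of the pattern.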