
SRAVAN KUMAR
Hadoop Developer
Email: sravankumar.hadoop09@gmail.com
Contact: 510-509-6104

Professional Summary

Around 9 years of expertise in product and application development, Agile methodologies, systems programming, requirements gathering, technical documentation, and the design and development of Big Data solutions for enterprise application systems, with extensive design and implementation exposure across various domains
4+ years of experience with full project lifecycle development in J2EE technologies, including software design, analysis, coding, development, testing, implementation, maintenance, and support
Sound knowledge of Java, J2EE, and related technology components
Around 4 years of experience with Hadoop ecosystem components: HDFS, MapReduce (MRv1, YARN), Pig, Hive, HBase, Sqoop, Flume, Impala, Oozie, ZooKeeper, Storm, Tez, and Cassandra, plus programming in Spark using Scala
Hands-on experience with Hadoop 1.0, Hadoop YARN, Apache Storm, Apache Flume, HDFS, and Apache Ambari

Expert knowledge of RabbitMQ and Apache Kafka

Extensive knowledge of data serialization with Apache Avro and Protocol Buffers

Hands-on knowledge of the InfluxDB time-series database

Extensive knowledge of MongoDB (NoSQL)

Experience in developing Contract-First and Contract-Last Web Services (SOA)

In-depth knowledge of the Spring WS framework 2.0

Hands-on experience with Adobe Flex

Proficient knowledge of JDBC, PL/SQL, and stored procedures

Expertise in implementing the Struts, JBoss Seam 2.1, Hibernate 3.x, and Spring 2.x frameworks for enterprise projects

Experience in implementing web services in a REST-style architecture

Sound knowledge of Oracle AS 10g, JBoss 4.3, Apache Tomcat 6.x, Apache Axis 2, and WebLogic 9.x/8.x
Extensive experience in developing custom widgets using Google Web Toolkit (GWT) and in client-server communication
Very good understanding of XML technologies: SAX, DOM, W3C XML Schema, JAXB 2.0, and XMLBeans
Expertise in developing database applications using Oracle, MySQL, and SQL Server 2005
Experience with Agile development practices such as iterations, Scrum meetings, test-driven development, pair programming, and code refactoring

Sound knowledge of the VersionOne agile project management and tracking tool

Excellent knowledge of OOP methodologies for extensive product development in the area of business applications, and of RDBMS concepts

Extensive experience in all phases of the project life cycle, including conceptual, functional, and technical design as well as application programming

Sound knowledge of UML diagrams and Rational Rose

Excellent working knowledge of JavaScript and Ajax

Experience in coding Ant build scripts for application deployments
Extensive knowledge of Maven for building and managing Java projects
Experience in the design and development of user interfaces using HTML and DHTML
Exposure to CMM Level 5 processes for project deliverables
Good domain experience in e-commerce, mutual funds (finance), retail, and web applications
Strong analytical and problem-solving skills; willingness and ability to adapt to new environments and learn new web technologies
Developed simple to complex MapReduce jobs using Hive and Pig to handle files in multiple formats (JSON, text, XML, Avro, SequenceFile, etc.)
Worked extensively with combiners, custom partitioning, and the distributed cache to improve the performance of MapReduce jobs (see the driver sketch after this list)
Experience in working with different data sources such as flat files, XML files, log files, and databases
Good knowledge of Apex
Experience in developing Maven and Ant scripts to automate the compilation, deployment, and testing of web applications
Led and guided client teams in building streaming applications using Java, sharing best practices and patterns
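For illustration, a minimal sketch of the kind of MapReduce driver described above, wiring a combiner and a custom partitioner into a word-count job (the class names and the first-character partitioning rule are hypothetical):

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Partitioner;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCountDriver {

        // Tokenizes each input line and emits (word, 1).
        public static class TokenMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context ctx)
                    throws IOException, InterruptedException {
                StringTokenizer it = new StringTokenizer(value.toString());
                while (it.hasMoreTokens()) {
                    word.set(it.nextToken());
                    ctx.write(word, ONE);
                }
            }
        }

        // Used both as the combiner (local pre-aggregation on each mapper,
        // which shrinks the data shuffled across the network) and as the reducer.
        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                ctx.write(key, new IntWritable(sum));
            }
        }

        // Custom partitioner: routes keys by first character so related keys
        // always land on the same reducer.
        public static class FirstCharPartitioner
                extends Partitioner<Text, IntWritable> {
            @Override
            public int getPartition(Text key, IntWritable value, int numPartitions) {
                return (key.charAt(0) & Integer.MAX_VALUE) % numPartitions;
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCountDriver.class);
            job.setMapperClass(TokenMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setPartitionerClass(FirstCharPartitioner.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }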

Technical Skills
Hadoop Ecosystem: HDFS, MapReduce, YARN, Pig, Hive, HBase, Sqoop, Flume, Oozie, ZooKeeper, Spark, Scala, Kafka, and Storm
Data Discovery Applications: Elasticsearch and Kibana
NoSQL Databases: HBase and MongoDB
Hadoop Distributions: CDH and Hortonworks
Build and Version Control Tools: Maven, Apache Ant, GitHub, and SVN
Programming Languages: Java, C, C++, R, Scala, and Python; data structures
Java Technologies: JDBC, NetBeans, Servlets, JSP, and AJAX
Web User Interfaces: HTML, DHTML, JavaScript, and XML
Web Servers: Apache Tomcat
Web Services: REST and SOAP
Databases: Oracle 11g, MySQL, and Teradata
Development Tools: Eclipse, PuTTY, and Tectia
Scripting Languages: Shell script, Perl, Ant script, and UNIX scripting

Professional Experience
Hadoop Developer
Toyota, Georgetown, KY
Apr 2015 - Present
Roles and Responsibilities

Developed an Apache Storm (topologies), Kafka, and HDFS integration project to perform real-time data analysis
Designed and developed Apache Storm topologies for inbound and outbound data for real-time ETL to find the latest trends and keywords (a topology sketch follows this list)
Developed the RabbitMQ-to-Flume, Kafka-to-Flume, Storm-to-Flume, RabbitMQ-to-Kafka, and RabbitMQ-to-RabbitMQ components
Designed the Apache Avro schemas for data serialization into HDFS
Developed a Kafka source for Apache Flume to bulk-load data (100 million records) from files into HDFS
Migrated HBase data between different clusters
Used Hive to analyze the partitioned data
Involved in the design and development of the Customer and Provider portals
Developed GWT web components (Highcharts) for the portal dashboard
Developed the SOA services and client bindings for GWT
Created the REST API for time-series data using the Dropwizard framework
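For illustration, a minimal sketch of the kind of Storm 0.9 topology described above, wiring a Kafka spout to a keyword-extraction bolt (the topic, ZooKeeper address, and bolt logic are hypothetical; the packages use the pre-Apache backtype.storm namespace of that release):

    import backtype.storm.Config;
    import backtype.storm.LocalCluster;
    import backtype.storm.spout.SchemeAsMultiScheme;
    import backtype.storm.topology.BasicOutputCollector;
    import backtype.storm.topology.OutputFieldsDeclarer;
    import backtype.storm.topology.TopologyBuilder;
    import backtype.storm.topology.base.BaseBasicBolt;
    import backtype.storm.tuple.Fields;
    import backtype.storm.tuple.Tuple;
    import backtype.storm.tuple.Values;
    import storm.kafka.KafkaSpout;
    import storm.kafka.SpoutConfig;
    import storm.kafka.StringScheme;
    import storm.kafka.ZkHosts;

    public class TrendTopology {

        // Splits each incoming message into keywords for downstream counting.
        public static class KeywordBolt extends BaseBasicBolt {
            @Override
            public void execute(Tuple input, BasicOutputCollector collector) {
                for (String word : input.getString(0).toLowerCase().split("\\s+")) {
                    collector.emit(new Values(word));
                }
            }

            @Override
            public void declareOutputFields(OutputFieldsDeclarer declarer) {
                declarer.declare(new Fields("keyword"));
            }
        }

        public static void main(String[] args) {
            // Read string messages from a Kafka topic via ZooKeeper.
            SpoutConfig spoutConf = new SpoutConfig(
                    new ZkHosts("zkhost:2181"), "inbound-events", "/kafka", "trend-reader");
            spoutConf.scheme = new SchemeAsMultiScheme(new StringScheme());

            TopologyBuilder builder = new TopologyBuilder();
            builder.setSpout("kafka-spout", new KafkaSpout(spoutConf), 2);
            builder.setBolt("keywords", new KeywordBolt(), 4)
                   .shuffleGrouping("kafka-spout");

            // LocalCluster is for local testing; a production deployment
            // would use StormSubmitter instead.
            new LocalCluster().submitTopology(
                    "trend-topology", new Config(), builder.createTopology());
        }
    }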

Environment:
Apache Storm 0.9, Hadoop YARN, HBase, Apache Kafka, Apache Flume, Dropwizard, Hive, SOA, GWT, JDK 1.5, RabbitMQ 3.5.3, MongoDB, Apache Avro, Protocol Buffers, Hortonworks Data Platform, Apache Ambari, InfluxDB, Oozie workflow, ZooKeeper

Hadoop Developer
General Motors, Detroit, MI
July 2014 - Mar 2015
Roles and Responsibilities

Wrote multiple Java MapReduce jobs for data cleaning and preprocessing (a mapper sketch follows this list)
Experienced in defining job flows using Oozie
Experienced in managing and reviewing Hadoop log files
Loaded and transformed large sets of structured, semi-structured, and unstructured data
Responsible for managing data coming from different sources and applications
Supported MapReduce programs running on the cluster
Involved in loading data from the UNIX file system to HDFS
Involved in designing schemas, writing CQL queries, and loading data using Cassandra
Good experience with CQL data manipulation commands and CQL clauses
Worked with CQL collections
Installed and configured Hive and wrote Hive UDFs
Involved and experienced with DataStax
Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs
Developed MapReduce jobs to automate the transfer of data to and from HBase
Assisted with the addition of Hadoop processing to the IT infrastructure
Implemented (near) real-time search, creating indexes to make search results much faster
Used RESTful Java APIs and native APIs while working with Elasticsearch
Used Flume to collect the entire web log from the online ad servers and push it into HDFS
Implemented and executed MapReduce jobs to process the log data from the ad servers
Wrote efficient MapReduce code to aggregate the log data from the ad servers
Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting
Working knowledge of writing Pig Load and Store functions
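For illustration, a minimal sketch of the kind of log-cleaning mapper described above: it drops malformed ad-server log lines, counts the drops, and emits the rest keyed by client IP (the tab-separated field layout is hypothetical):

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class LogCleanMapper extends Mapper<LongWritable, Text, Text, Text> {
        private final Text ip = new Text();
        private final Text record = new Text();

        @Override
        protected void map(LongWritable offset, Text line, Context ctx)
                throws IOException, InterruptedException {
            // Expected layout: ip <TAB> timestamp <TAB> adId <TAB> action
            String[] fields = line.toString().split("\t");
            if (fields.length != 4 || fields[0].isEmpty()) {
                // Track how many lines were dropped as malformed.
                ctx.getCounter("logclean", "malformed").increment(1);
                return;
            }
            ip.set(fields[0]);
            record.set(fields[1] + "\t" + fields[2] + "\t" + fields[3]);
            ctx.write(ip, record);
        }
    }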

Environment: Hortonworks, MapReduce, HDFS, Hive, Pig, Flume, Oozie, Cassandra, Java 1.5, Elasticsearch, and Kibana

Hadoop Developer
Jackson, Detroit, MI
Jan 2013 - Apr 2014
Roles and Responsibilities

Gathered the business requirements from the business partners and subject matter experts
Responsible for managing data coming from different sources
Involved in HDFS maintenance and the loading of structured and unstructured data
Wrote MapReduce jobs using the Java API
Involved in managing and reviewing Hadoop log files
Used Sqoop to import data from MySQL into HDFS on a regular basis
Developed scripts and batch jobs to schedule various Hadoop programs
Wrote Hive queries for data analysis to meet the business requirements
Created Hive tables and worked with them using HiveQL (a JDBC-based query sketch follows this list)
Utilized the Agile Scrum methodology to help manage and organize a team of 4 developers, with regular code review sessions
Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers
Used JUnit for unit testing and Continuum for integration testing
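For illustration, a minimal sketch of running HiveQL over a partitioned table through Hive's JDBC driver (the host, table, and partition column are hypothetical):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveQueryExample {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                         "jdbc:hive2://hive-host:10000/default", "user", "");
                 Statement stmt = conn.createStatement()) {
                // Restricting on the partition column (dt) lets Hive prune
                // partitions instead of scanning the whole table.
                ResultSet rs = stmt.executeQuery(
                        "SELECT category, COUNT(*) FROM sales " +
                        "WHERE dt = '2013-06-01' GROUP BY category");
                while (rs.next()) {
                    System.out.println(rs.getString(1) + " -> " + rs.getLong(2));
                }
            }
        }
    }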

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, MySQL, Java (JDK 1.6), and JUnit

Sr. Java Developer / Hadoop Developer
Blue Cross Blue Shield, Chattanooga, TN
May 2011 - Nov 2012
Roles and Responsibilities

Involved in coding JSP pages for the presentation of data in the View layer of the MVC architecture
Involved in requirements gathering, analysis, and development of the Insurance Portal application
Used J2EE design patterns such as Factory Method, MVC, and Singleton, which made modules and code more organized, flexible, and readable for future upgrades
Worked with JavaScript to perform client-side form validations
Used the Struts tag libraries as well as the Struts Tiles framework
Used JDBC with the Oracle thin (Type 4) driver to access the database for application optimization and efficiency
Used the Data Access Object pattern to make the application more adaptable to future and legacy databases
Actively involved in tuning SQL queries for better performance
Wrote generic functions to call Oracle stored procedures, triggers, and functions
Used JUnit for testing the application on the test servers
Provided support for System Integration Testing and User Acceptance Testing
Used Oracle SQL Developer for writing SQL queries and procedures
Involved in resolving issues routed through trouble tickets from the production floor
Participated in technical and functional reviews
Involved in performance tuning of the application
Used Log4j for extensible logging, debugging, and error tracing
Discussed new developments and errors with the client and the project manager
Involved in production support and maintenance
Involved in transferring data from MySQL to HDFS using Sqoop
Wrote MapReduce jobs according to the analytical requirements
Developed Java programs to clean huge datasets and for preprocessing
Responsible for creating Pig scripts and performing analysis on large datasets
Worked with different kinds of files, such as text and XML data
Involved in developing UDFs for Pig scripts (a UDF sketch follows this list)
Reported the fetched results to the BI department
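For illustration, a minimal sketch of the kind of Pig eval UDF described above; it trims and upper-cases a field (the class name and normalization rule are hypothetical):

    import java.io.IOException;

    import org.apache.pig.EvalFunc;
    import org.apache.pig.data.Tuple;

    public class NormalizeField extends EvalFunc<String> {
        @Override
        public String exec(Tuple input) throws IOException {
            if (input == null || input.size() == 0 || input.get(0) == null) {
                return null; // Pig treats null as missing data
            }
            return ((String) input.get(0)).trim().toUpperCase();
        }
    }

In a Pig script, such a UDF would be registered with REGISTER and then invoked like any built-in function.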

Environment: JDK, J2EE, UML, Servlets, JSP, JDBC, Struts, XHTML, JavaScript, MVC, XML, XML Schema, Tomcat, Eclipse, CDH, Hadoop, HDFS, Pig, MySQL, and MapReduce

Java Developer
US Steel, Pittsburgh, PA
Oct 2010 - Feb 2011
Roles and Responsibilities

Involved in requirements gathering and analysis for the project
Designed the functional specifications and architecture of the web-based module using Java technologies
Created design specifications using UML class diagrams and sequence and activity diagrams
Developed the web application using the MVC architecture, Java, JSP, Servlets, and an Oracle database
Developed various Java classes, SQL queries, and procedures to retrieve and manipulate data from the backend Oracle database using JDBC
Extensively worked with JavaScript for front-end validations
Analyzed business requirements and developed the system architecture document for the enhancement project
Designed and developed applications on a Service-Oriented Architecture (SOA)
Created UML artifacts (use cases, class diagrams, activity diagrams, component diagrams, etc.) using Visio
Provided impact analysis and test cases
Delivered the code within the timeline and logged the bugs/fixes in TechOnline, the tracking system
Developed unit and functional test cases for testing the web application
Used the Spring (MVC) architecture to implement the application using the concrete principles laid down by several design patterns, such as Composite View, Session Facade, Business Delegate, Bean Factory, Singleton, Data Access Object, and Service Locator
Involved in the integration of Spring for implementing Dependency Injection
Developed code for obtaining bean references in the Spring IoC framework (a minimal lookup sketch follows this list)
Focused primarily on the MVC components, such as Dispatcher Servlets, Controllers, Model and View objects, and the View Resolver
Involved in creating the Hibernate POJOs and utilizing Hibernate annotations
Used Hibernate, an object/relational mapping (ORM) solution, to map data between Java objects and database tables
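For illustration, a minimal sketch of obtaining a bean reference from the Spring IoC container, as described above (the bean id, service interface, and beans.xml file are hypothetical):

    import org.springframework.context.ApplicationContext;
    import org.springframework.context.support.ClassPathXmlApplicationContext;

    public class BeanLookupExample {

        public interface QuoteService {
            String quoteFor(String symbol);
        }

        // A trivial implementation the container can instantiate.
        public static class FixedQuoteService implements QuoteService {
            public String quoteFor(String symbol) {
                return symbol + ": 42.00";
            }
        }

        public static void main(String[] args) {
            // beans.xml would declare:
            // <bean id="quoteService" class="BeanLookupExample$FixedQuoteService"/>
            ApplicationContext ctx = new ClassPathXmlApplicationContext("beans.xml");
            QuoteService service = (QuoteService) ctx.getBean("quoteService");
            System.out.println(service.quoteFor("XYZ"));
        }
    }

The caller depends only on the QuoteService interface; the container injects the concrete implementation, which is the point of Dependency Injection.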

Environment: Windows NT/2000/2003, XP, and Windows 7/8, Java, UNIX, SQL, SOA, JDBC, JavaScript, Maven, JUnit, Agile/Scrum methodology, and SVN version control

Java Developer
Ecolite Technologies, India
Dec 2009 - Aug 2010
Roles and Responsibilities

Developed web components using JSP, Servlets, and JDBC
Designed tables and indexes
Designed, implemented, tested, and deployed Enterprise JavaBeans, both Session and Entity, using WebLogic as the application server
Developed stored procedures, packages, and database triggers to enforce data integrity; performed data analysis and created Crystal Reports for user requirements
Implemented the presentation layer with HTML, XHTML, and JavaScript
Used EJBs to develop business logic and coded reusable components in JavaBeans
Developed database interaction code with the JDBC API, making extensive use of SQL query statements and advanced PreparedStatements
Used connection pooling for optimization through the JDBC interface (a pooled-DataSource sketch follows this list)
Used EJB entity and session beans to implement business logic, session handling, and transactions
Developed the user interface using JSP, Servlets, and JavaScript
Wrote complex SQL queries and stored procedures
Actively involved in system testing
Prepared the installation, customer guide, and configuration documents, which were delivered to the customer along with the product
Responsible for creating a working model using HTML and JavaScript to understand the flow of the web application, and created class diagrams
Participated in the daily stand-up Scrum meetings as part of the Agile process to report the day-to-day progress of the work done
Designed and developed user interfaces using HTML and JSP
Used J2EE to develop the application based on the MVC architecture
Created interactive front-end GUIs using JavaScript, jQuery, DHTML, and Ajax
Used SAX and DOM XML parsers for data retrieval
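For illustration, a minimal sketch of JDBC access through a container-managed connection pool looked up via JNDI, as in the WebLogic setup described above (the JNDI name, table, and columns are hypothetical):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class PooledQueryExample {
        public static String findCustomerName(int customerId) throws Exception {
            // The pool itself is configured in the application server and
            // exposed as a DataSource under a JNDI name.
            DataSource ds = (DataSource) new InitialContext()
                    .lookup("jdbc/AppDataSource");
            try (Connection conn = ds.getConnection();   // borrowed from the pool
                 PreparedStatement ps = conn.prepareStatement(
                         "SELECT name FROM customers WHERE id = ?")) {
                ps.setInt(1, customerId);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getString(1) : null;
                }
            } // closing the connection returns it to the pool
        }
    }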

Environment: Windows NT/2000/2003, XP, and Windows 7/8, C, Java, JSP, Servlets, JDBC, EJB, DOM, XML, SAX

Java Developer
Prokarma, India
Aug 2007 - Oct 2009
Roles and Responsibilities

Successfully completed the architecture, detailed design, and development of modules
Interacted with end users to gather, analyze, and implement the project requirements
Developed applications that enable the public to review the inventory management system
Established schedule and resource requirements by planning, analyzing, and documenting the development effort, including timelines, risks, test requirements, and performance targets
Analyzed system requirements and prepared the system design document
Developed dynamic user interfaces with DHTML and JavaScript using JSP and Servlet technology
Designed and developed a subsystem in which Java Message Service (JMS) applications communicate with MQ to exchange data between different systems
Used the Java Message-Oriented Middleware (MOM) API for sending messages between clients
Used JMS elements for sending and receiving messages (a send/receive sketch follows this list)
Used Hibernate for mapping Java classes to database tables
Wrote PL/SQL and SQL in the Oracle database for creating tables, indexes, triggers, and query statements
Designed and developed enterprise web applications for the internal production support group using Java (J2EE), design patterns, and the Struts framework
Performed tuning and index creation for improved performance
Designed and developed database schemas for new applications
Created a connection pooling method to avoid waiting for database connections
Designed ER diagrams for all the databases using DB Designer, an open-source tool
Designed the class diagrams and use case diagrams using an open-source tool
Created and executed test plans using Quality Center (formerly TestDirector)
Mapped requirements to the test cases in Quality Center
Supported system testing and user acceptance testing
Supported internal users and support personnel by drafting and reviewing company product documentation, such as technical documents and impact assessment documents
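For illustration, a minimal sketch of point-to-point JMS messaging of the kind described above (the JNDI names are hypothetical; the javax.jms calls are the standard JMS 1.1 API):

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageConsumer;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;

    public class JmsExchangeExample {
        public static void main(String[] args) throws Exception {
            InitialContext jndi = new InitialContext();
            ConnectionFactory factory =
                    (ConnectionFactory) jndi.lookup("jms/ConnectionFactory");
            Queue queue = (Queue) jndi.lookup("jms/OrderQueue");

            Connection conn = factory.createConnection();
            try {
                conn.start();
                Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);

                // Producer side: send a text message to the queue.
                MessageProducer producer = session.createProducer(queue);
                producer.send(session.createTextMessage("order:42"));

                // Consumer side: receive it back, waiting up to 5 seconds.
                MessageConsumer consumer = session.createConsumer(queue);
                TextMessage msg = (TextMessage) consumer.receive(5000);
                System.out.println(msg == null ? "nothing received" : msg.getText());
            } finally {
                conn.close(); // also closes the session, producer, and consumer
            }
        }
    }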

Environment: Rational Application Developer 6.0, Rational Rose, Java, J2EE, JDBC, EJB, JSP, EL, JSTL, JUnit, XML, SOAP, WSDL, SOA

References

Available upon request
