
Shirisha Hadoop

Software Engineer - Exelocom Technology


Chennai, Tamil Nadu - Email me on Indeed: indeed.com/r/Shirisha-Hadoop/e3fe4297dee61b4e

# Over 3.5 years of experience in the IT industry across Hadoop, MicroStrategy, and Informatica
# 1.8 years of experience in big data technologies: Hadoop (HDFS, MapReduce) and Hadoop ecosystem tools such as Pig, Hive, and Sqoop
# Good exposure to Hadoop's query programming models, Pig and Hive
# Exposure to distributed databases such as HBase and MongoDB
# Created and modified many Pig scripts according to business demand
# Knowledge of HDFS, NameNode, DataNode, JobTracker, and TaskTracker and their working functionality
# Strong experience in the MicroStrategy reporting suite (Desktop, Intelligence Server, Narrowcast Server, Command Manager, Integrity Manager)
# Expertise in creating and integrating schema objects and public objects (Attributes, Filters, Metrics, Facts, Prompts, etc.) to develop reports
# Extensive experience in MicroStrategy Reporting Services
# Created complex reports in MicroStrategy Desktop using features such as Custom Groups, Consolidations, Level Metrics, and Advanced Filters
# Strong experience in creating interactive dashboards and scorecards
# Knowledge of the ETL data load process using Informatica
# Knowledge of designing and developing mappings from varied transformation logic such as Expression, Aggregator, Filter, Joiner, and reusable transformations
# Solid knowledge of SQL and data warehouse (OLAP) concepts
# Strongly believe in working smart rather than working hard
# Consistently high-performing, self-motivated, and a quick learner

WORK EXPERIENCE

Software Engineer / Hadoop Developer
Exelocom Technology - Chennai, Tamil Nadu - November 2009 to Present

EDUCATION

M.C.A
Anna University

ADDITIONAL INFORMATION

Technical Skills
Languages: Core Java, SQL
Databases: Oracle, Teradata, DB2
Frameworks: Hadoop (HDFS, MapReduce)
Hadoop Ecosystem: Pig, Hive, Sqoop, Flume
ETL: Informatica

CRM: Siebel

PROJECTS INVOLVED

Project #1
Client: WellCare Group, USA
Duration: Feb 2011 to till date
Role: Hadoop Developer
Description: WellCare Heartbeat offers a choice of five different levels of health insurance cover. These options range from basic cover, including in-patient and surgical care, to extensive health plans; cover may even extend to a pre-existing medical condition.
Responsibilities:
# Involved in all phases of the Software Development Life Cycle (SDLC)
# Developed a Proof of Concept (POC) for the project using tools and techniques such as Apache Flume, Hadoop MapReduce, Pig, Hive, and HBase
# Understood the client's big data requirements, recommended the technologies to be used, and implemented and tested the corresponding solutions
# Responsible for uploading datasets into a 20-node Hadoop 0.22 cluster
# Wrote MapReduce code to extract data for different requirements
# Loaded datasets into Hive and performed BI operations such as date- and region-specific reports and overall sales reports across different combinations
# Built jobs to load data into Hive
# Created tables in HBase and transformations to load data into HBase
# Wrote input/output formats for CSV
# Used Sqoop for import and export job entries

Project #2
Client: Sterling Bank, Texas
Duration: Nov 2009 to Jan 2011
Role: Software Engineer
Description: Sterling Bank, Texas is a banking company offering services such as savings accounts, personal loans, checking accounts, share trading, mortgage loans, credit cards, and safe deposit boxes. It has 2,500 branches across 32 US states. As part of its banking services it provides savings accounts, checking accounts, mortgage loans, and other loans. The client adopted data warehousing to improve its business by providing additional facilities to its customers.
Responsibilities:
# Extensively worked on Advanced Prompts, Conditional Transformation Metrics, and Level Metrics to create complex reports
# Created documents with both grids and graphs as per user requirements
# Performed MicroStrategy architecture and report development; worked with source systems to understand the data flow and interacted with user groups to collect business requirements
# Implemented a star schema for the data warehouse, covering logical/physical data modeling and dimensional modeling
# Involved in architecting fact columns and developing hierarchies with related attributes required for reporting
# Created advanced grid reports combining a template with filters; end users were able to generate ad-hoc reports by drilling up, down, within, and across dimensions
# Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer
# Extensively used transformations such as Router, Aggregator, Source Qualifier, Joiner, Expression, and Sequence Generator

# Fixed invalid mappings; tested stored procedures and functions; performed unit and integration testing of Informatica sessions, batches, and target data
# Scheduled sessions and batches on the Informatica server using Informatica Server Manager/Workflow Manager
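As a minimal illustration of the Hive load and reporting work described under Project #1, a query of this shape could be used; the table name, columns, and HDFS path here are hypothetical, not taken from the actual project:

```sql
-- Hypothetical Hive table over a CSV dataset uploaded to HDFS
CREATE TABLE sales (
  sale_date  STRING,
  region     STRING,
  product    STRING,
  amount     DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

-- Load the uploaded dataset from HDFS into the Hive table
LOAD DATA INPATH '/data/sales/sales.csv' INTO TABLE sales;

-- Date- and region-specific report, as in the BI operations above
SELECT region, sale_date, SUM(amount) AS total_sales
FROM sales
GROUP BY region, sale_date;
```

The Sqoop export mentioned above would typically move such results back into a relational database with the `sqoop export` tool, pointing `--export-dir` at the Hive table's warehouse directory (connection details would depend on the target database).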
