
Job Seeker

Sr. Ab Initio Developer at Citigroup


9+ years of extensive experience in the IT industry, with a thorough understanding of data warehousing terminology and proficiency in data warehousing tools such as Ab Initio. Involved in all phases of data warehousing, including ETL and Business Intelligence.
Expertise in UNIX, Oracle, ETL tools, data mining and analysis tools, data access methods, and data modeling tools (ERWIN); 6+ years of experience in Ab Initio, SQL, and PL/SQL.
5+ years of UNIX and UNIX shell scripting.
Experience in the design and deployment of ETL solutions for large-scale OLAP and OLTP instances using the ETL tools Ab Initio and Oracle.
Experience developing complex applications; adept at learning and adaptable to any kind of platform.
Designed, developed, implemented, and supported small and large projects both independently and as part of a team.
Specialization in ETL processes, dimensional modeling, data mining, Internet programming, and client/server programming.
Experience in writing UNIX shell scripts (Korn Shell, C Shell, and Bourne Shell).
Experience architecting, designing, and deploying ETL solutions for large-scale OLAP and OLTP instances using Ab Initio and custom scripts.
Experience with Oracle (9i, 8i, 8.x, 7.x) databases on UNIX, Windows, and Linux operating systems.
Coded well-tuned SQL, PL/SQL, SQL*Loader, and UNIX shell (ksh, csh) scripts for OLTP and OLAP data warehouse instances.
Extensive experience in ETL processes for ODS and DSS data warehouse instances. Developed Ab Initio graphs in an MPP (Massively Parallel Processing) environment for ETL jobs in high-volume, dynamically growing environments.
Exceptional ability to quickly master new concepts; capable of working in a group as well as independently, with good communication skills.

Work Experience

Sr. Ab Initio Developer


Citigroup - Irving, TX
January 2011 to Present
Citigroup has a number of applications running across different platforms, each with its own development and test environments. These environments contain valid, real customer data: customer-sensitive data is often copied from production to non-production test beds in its original form for test and analysis purposes. This poses a security challenge by breaching the privacy of customer-sensitive data. To counter the privacy breach, protect customer-sensitive data, and comply with legal requirements, customer-sensitive data needs to be redacted before being copied to non-production environments.
Ab Initio, the ETL tool, was identified to achieve this goal by developing an Ab Initio based framework that automates the process of data extraction and transformation based on user requests/requirements, redacting sensitive data elements. The Ab Initio based framework serves as a single ongoing service for data transformations across all Citigroup applications identified as sensitive.
Responsibilities:

Created generic graphs to move data from different sources, apply business logic, and load it into target databases.
Created graphs to generate DMLs dynamically for Oracle and SQL Server.
Created graphs to generate XFRs dynamically for all applications in Oracle and SQL Server.
Created an Ab Initio library package defining various redaction functions.
Created operational graphs and scripts for Oracle, SQL Server, and system files to redact data from source to target.
Extensively used the EME for creating different projects and for storing input request information for various applications.
Used the EME as a version control tool to perform check-ins and check-outs.
Involved in unit testing, system testing, and defect fixing.
Prepared documents explaining the whole redaction operation for all applications.
Developed Ab Initio graphs for data validation using validation components.
Documented all deliverables on a weekly basis and took responsibility for completing assigned tasks within deadlines.
Wrote wrapper scripts to call the generic graphs to automate the redaction process (see the sketch after this list).
Wrote Ab Initio library functions to transform customer-sensitive data for redaction.
Wrote Autosys JIL scripts to schedule redaction jobs in production.
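A minimal sketch of the wrapper-and-scheduler pattern above. Every name here (paths, parameter file, graph script, job name) is hypothetical, and it assumes the generic graph was deployed from the GDE as a Korn shell script:

#!/bin/ksh
# run_redaction.ksh -- hypothetical wrapper around a deployed Ab Initio graph.
# Usage: run_redaction.ksh <APP_NAME>
APP=${1:?"usage: $0 <APP_NAME>"}
PARAM_FILE=/apps/redact/config/${APP}.params       # invented per-application parameter file

[ -r "$PARAM_FILE" ] || { print -u2 "no parameter file for $APP"; exit 1; }
. "$PARAM_FILE"                                    # exports source/target connection parameters

/apps/redact/bin/redact_generic.ksh                # the deployed generic graph script
rc=$?
[ $rc -eq 0 ] || { print -u2 "redaction failed for $APP (rc=$rc)"; exit $rc; }
print "redaction complete for $APP"

# A matching (equally hypothetical) Autosys definition, loaded via the jil utility:
jil <<'EOF'
insert_job: RDCT_CARDS_DLY   job_type: c
command: /apps/redact/bin/run_redaction.ksh CARDS
machine: prod_etl_host
owner: abinitio
std_out_file: /apps/redact/logs/RDCT_CARDS_DLY.out
std_err_file: /apps/redact/logs/RDCT_CARDS_DLY.err
EOF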
Environment: Ab Initio Co>OS 2.14.128, GDE 1.14.37, Oracle, SQL Server, Flat files, Windows XP,
UNIX Servers, Autosys.

Sr. Ab Initio consultant


PepsiCo - Plano, TX
June 2009 to December 2010
The iFramework and services developed in the EDS project were deployed for actual use in R7 with enhancements. Legacy data from multiple sources is being migrated to SAP. Sources were identified and strict timelines were defined for the migration process. SAP will be the system of record after successful live proving of the SAP implementation. The migration is being rolled out in phases/releases.
R7 migrates the FLNA data, replacing the FLNA legacy machines. Conversion of sources and subject areas uses the iFramework. SAP is an interface to the hub just like any other data repository seeking information through the service development.
Since the data migration to SAP is ongoing, this is a good point to build an interface around it, as few sources have been migrated to SAP yet. Target logical model: PELDM; physical model: PEDW; subject areas of Items, Customers, Routes, Lists, etc. (a total of 45 subject areas).
Objectives of the project:
Services to load, populate, manage and use the Enterprise data warehouse and/or the Master Data Hub
Further to the conversion of data, use SOA concepts to build and deliver data as a service using Ab
Initio.
SOA-based service development for services called in SAP Web Dynpro using Ab Initio
Perform As-is Analysis in partnership with Legacy, DMO, and Process Team Subject Matter Experts.
Enter metadata into EME repository
Profile Source & target data in partnership with Legacy, DMO, and Process Team Subject Matter
Experts. Enter metadata into EME repository.
Perform Profiling, Perform Mapping, Create Rule sets, Create Conversions using iFramework Toolset
and approved processes and techniques.
Facilitate mapping work sessions to map in-scope attributes from source applications to their target (SAP)
Deliver FMDs in partnership with the Process Teams, DMO, and Legacy embeds. Check into StarTeam.
Deliver Technical Design Documents. Check into StarTeam.
Create Conversion Process Flow Diagrams and relevant Conversion documentation.

Create & deliver Conversion objects.


Support Conversion Objects through Unit Testing, Assembly Testing, Mock, and Go Live.
Deliver FRE files to DMO
Partner with Process Teams, the DMO, Legacy Teams, R&T, throughout the delivery of conversion
objects
Provide SME consulting to the Process, DMO, and Legacy Teams
As a SME, understand the Business relationships and how they relate to the data.
Grow depth of subject area expertise in at least one functional area in support of PepsiCo's
transformational effort
Partner with technical leads to ensure that all deliverables are aligned, scheduled, are on target, and
are delivered.
Mentor team members in area of expertise, best practices, standards, tools, and techniques
Perform weekly administrative tasks on time with accuracy - Status Reports, Metrics, Entering Actuals
in PWA, 1UP time if applicable
Ensure that all testing deliverables are met and ready for ATPs, approvals, delivery
Define and create relevant testing documents, such as test plans, test cases, and test inventory per
the project methodology
Document minutes of all meetings that you drive, and post to eRoom for formal record of all attendees,
discussions, decisions, and issues.
Environment: Ab Initio GDE 1.15.6.1, 1.16.1.1.3, Co>Operating System 2.15.29, 2.16.1, EME, Business Rules Environment 1.15.3.0, Ab Initio Data Profiler 1.7.10, EME Management Console 1.15.10.2, UNIX sun4u OS version 5.8, Oracle 10g, 9i, Teradata V2R5, Toad 7.11, 9.0, BO XI R2.

Sr. Ab Initio consultant


Fair Isaac - Chicago, IL
March 2007 to May 2009
Fair Isaac facilitates the end-to-end consumer marketing/database operations process, from program planning and definition through to analysis and insight development. FIC is the hub for program and campaign planning and execution. This project involves collaboration among Shire brand teams, agency partners, and Fair Isaac to build Consumer Database Marketing and Operations for Shire: developing the consumer marketing database system, associated processes, and reporting to enable closed-loop marketing for Shire's brand programs; leveraging and optimizing existing 3rd-party vendor-hosted data; helping coordinate the multiple vendors supporting the brands; operating as the hub for all consumer marketing programs, data, and fulfillment processing; and creating a solution and approach that is extensible to integrate other brands and future customer segments, with dependable and responsive operations that support a dynamic market environment.
Responsibilities:
Involved in PDP and Analyze sessions. Gathered business requirements and provided solutions.
Implemented the Ab Initio environment and served as a single point of contact for technical issues on Ab Initio.
Prepared high level design documents.
Built graphs to implement complex business requirements.
Prepared documents to schedule jobs in production. Coordinated with Production support team to
resolve production issues.
Created and maintained database and application requirements and design information working with
the Data Modelers.
Diagnosed production issues to ensure a timely recovery and minimal impact to our business users.
Validated unit test cases and ensured data elements were properly tested.
Fine-tuned existing graphs to reduce CPU and memory consumption.
Participated in Code Reviews and Code Turnovers as an Author, Reviewer, or Moderator.

Migrated ETL and SQL programs to QA and prepared appropriate docs for code move to QA and
production.
Worked with third-party consultants and experts to accomplish specific projects and tasks.
Helped identify gaps and areas for improvement to enhance the state of current processes. Maintained
the integrity of our systems as well as the stability and reliability of supported processes.
Recommended changes to the architectural data schema as needed.
Scheduled the jobs in autosys and supported them in production.
Wrote UNIX shell scripts to test the application and SQL queries for the Oracle database (see the sketch after this list).
Maintained backup schedules for database storage. Read and interpreted UNIX logs.
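A minimal sketch of the kind of test script described above: it compares source and target row counts after a load. The connection strings and table names are invented for illustration:

#!/bin/ksh
# check_row_counts.ksh -- hypothetical post-load sanity check.
SRC_CNT=$(sqlplus -s ${DB_USER}/${DB_PASS}@SRCDB <<'EOF' | awk 'NF { print $1 }'
set heading off feedback off pagesize 0
select count(*) from customer_stg;
EOF
)
TGT_CNT=$(sqlplus -s ${DB_USER}/${DB_PASS}@TGTDB <<'EOF' | awk 'NF { print $1 }'
set heading off feedback off pagesize 0
select count(*) from customer_dim;
EOF
)

if [ "$SRC_CNT" -ne "$TGT_CNT" ]; then
    print -u2 "row count mismatch: source=$SRC_CNT target=$TGT_CNT"
    exit 1
fi
print "row counts match: $SRC_CNT"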
Environment: Ab Initio GDE 1.13.29, 1.14.39, Co>Operating System 2.13.8, 2.14.112, UNIX sun4u OS
version: 5.8, Oracle 10g, 9i, Toad 7.11, 9.0, CA- Autosys, Data Migrator, BO XI R2.

Ab Initio Consultant
Aetna Insurance - Middletown, CT
July 2006 to February 2007
This Aetna health insurance project covers prescription management and health information. The process loads client and standard data into a warehouse database. The data is thoroughly checked for validity during processing. Audit data is created that characterizes the data in various categories; this data is used in reports that document the quality of the data as well as the quality of the load process. These reports also help identify changes in the client's data over time. Client-specific monitoring allows the load process to be interrupted for review or correction of the data.
Responsibilities:
Used Ab Initio as the ETL tool to pull data from source systems and to cleanse, transform, and load the data into databases.
Tuned Ab Initio graphs for better performance.
Developed complex Ab Initio XFRs to derive new fields and solve rigorous business requirements.
Customized the external credit fraud protection data to the new physical data model.
Developed shell scripts to customize the ETL Ab Initio graphs at runtime.
Worked with business users to define the functional metrics for the external credit fraud protection data customization.
Automated the data loads using UNIX shell scripting for the production, testing, and development environments.
Implemented data parallelism through graphs, dividing data into segments and operating on each segment simultaneously using the Ab Initio partition components.
Performed transformations of source data with transform components such as Join, Match Sorted, Dedup Sorted, Denormalize Sorted, Reformat, and Filter by Expression.
Created summary tables using Rollup, Scan, and Aggregate.
Made wide use of lookup files when getting data from multiple sources where the size of the data is limited.
Wrote and modified several application-specific config scripts in UNIX to pass environment variables (see the sketch after this list).
Coded graphs to extract data from COBOL files.
Developed UNIX Korn shell wrapper scripts to accept parameters and scheduled the processes using Zeak. Made extensive use of Mega death to load data from flat files into tables.
Developed a script that automated the loading process to the DB2 database.
Worked closely with Cognos Planning Contributor.
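A minimal sketch of an application-specific config script of the kind mentioned above, sourced by wrapper scripts at runtime. The variable names and paths are illustrative, not the actual project parameters:

#!/bin/ksh
# app_env.ksh -- hypothetical runtime configuration, sourced by graph wrappers.
export AI_SERIAL=/data/app1/serial              # serial work area (illustrative name)
export AI_MFS=/data/app1/mfs                    # multifile system root (illustrative name)
export DB_CONFIG=/apps/app1/db/db2.dbc          # database interface file
export REJECT_DIR=/data/app1/reject             # where reject ports write their records
export RUN_DATE=${RUN_DATE:-$(date +%Y%m%d)}    # overridable processing date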
Environment: Ab Initio GDE 1.13.26/1.11.6, Co>Operating System 2.11.7/2.13.8, UNIX, Zeak, TSO Mainframe, DB2, COBOL, SQL Server, Cognos.

Ab Initio consultant

Capital One Auto Finance - Dallas, TX


March 2006 to June 2006
Description: The project involved corporate data warehouse (CDW) for Capital One. The corporate data
warehouse was designed to create a single point of reference to extract customer information from
various sources. This system helped the customer service representatives to deal and transact with
the customers more efficiently and promptly. The data marts were basically used to produce summary
reports / web reports. Many custom UNIX scripts, Ab Initio graphs, PL/SQL programs were developed
to provide a best-fit solution.
Responsibilities:
Worked with Data Architect to interpret new data warehouse models and mapping documentation.
Worked with project analyst(s) to accurately interpret requirements and develop solutions to meet
specific deliverables.
Built the graphs needed to retrieve data from the specified data sources based on functional and technical requirements.
Incorporated Error handling components/sub-graphs from existing environment.
Incorporated Master-logging component/sub-graphs from existing environment.
Made recommendations for Testing scripts & scenarios based on Functional & Business requirements
and solution design.
Provided status reporting and development progress to Project Manager.
Validated processes and architecture of the overall Data Warehouse applications.
Environment: Ab Initio 1.13.6, SQL Server, ClearCase, Oracle 10g, Solaris UNIX

Ab Initio consultant
Walgreen's - Chicago, IL
November 2005 to February 2006
Description: 340B is a government-sponsored program that allows eligible clients matched with a pharmacy (Walgreen's and non-Walgreen's) to fill prescriptions at a much lower cost. Clinics must meet certain criteria to be eligible, and stock replenishments are ordered from the drug wholesaler. The project goal is to provide client and accounting reports for 340B claims and to automate the drug replenishment process with the drug wholesaler. Plans will be set up in PBS that have a 340B component. Claims from PBS will be transmitted to PBM/DSS, where reporting and order replenishment will take place.
Responsibilities: Worked with the DW development teams to define new and innovative ways of
representing and analyzing data for implementation in the DW application environments.
Defined data requirements and design associated with the implementation of various Data Mart
initiatives.
Created and maintained database and application requirements and design information, working with the Data Modelers and utilizing CASE tools and logical models.
Worked with the database and application team members to ensure implementation of the intended
design.
Teamed with Business Intelligence team to ensure delivery of intended business meaning.
Worked with the Data Management group to create data administration QA standards.
Resolved data issues and facilitated data analysis.
Provided production support of the existing applications (on call as necessary).
Performed daily maintenance and administration of DW production runs.
Served as point-of-contact for tech problem resolution.
Troubleshot SQL issues within escalation guidelines.
Participated as a tech resource as needed for customer issue reviews.
Gained a thorough understanding of assigned applications with intimate knowledge of customer
business purpose, operations, systems configuration, and application software environment.

Stayed current and proficient on existing and emerging DW and ETL technology through self-study and
company- and vendor-provided training opportunities.
Executed all non-revenue-based change and shared infrastructure change as needed.
Conducted quality assurance walkthroughs.
Provided status on a regular basis.
Environment: Ab Initio 1.13.6, Oracle 9i, Solaris UNIX

Ab Initio Consultant
XEROX - Portland, OR
June 2005 to November 2005
Description: The core project involved an overall migration of multiple data sources from the Oracle 11i applications to an Oracle data warehouse to facilitate efficient decisions in areas like retailing and customer information processing. The other phase involved extraction, transformation, and loading of data into the dimension and fact tables and flat files. The data from different operational sources was extracted, transformed, and finally loaded into various dimension and fact tables.
Responsibilities: Involved in the data movement design and development for legacy data into a newly defined data model for the data warehouse.
Loaded the fact and dimension tables with a focus on performance.
Implemented the Ab Initio environment, which connects to a central EME repository.
Designed and deployed optimized Ab Initio graphs for high-volume transformations.
Implemented the aggregation of the same data across different tables in the legacy system.
Made wide use of lookup files when getting data from multiple sources where the size of the data is limited.
Phase two involved design and development for validating the legacy data with external data providers, which included credit fraud protection.
Used Ab Initio data cleansing functions such as is_valid, is_defined, is_error, string_substring, and string_concat, and other string, date, inquiry, and miscellaneous functions.
Tuned Ab Initio graphs for better performance.
Developed complex Ab Initio XFRs to derive new fields and solve rigorous business requirements.
Customized the external credit fraud protection data to the new physical data model.
Developed shell scripts to customize the ETL Ab Initio graphs at runtime.
Worked with business users to define the functional metrics for the external credit fraud protection data customization.
Worked extensively with the EME repository.
Automated the data loads using UNIX shell scripting for the production, testing, and development environments.
Implemented data parallelism through graphs, dividing data into segments and operating on each segment simultaneously using the Ab Initio partition components.
Environment: Oracle 8i, Ab Initio CO>OS 2.13, Ab Initio GDE 1.13, EME, UNIX, HP superdome/RP
8400, and Windows NT/2000

AB Initio Consultant
Wells Fargo - Sunnyvale, CA
March 2004 to May 2005
Description: Wells Fargo receives import files from mortgage insurers and sends files to various insurers. The flexibility of the file formats is achieved not by having multiple defined import files, but by having a single type of import file with multiple defined record types. Generated reports in MicroStrategy to elicit the information for loan selling and buying decisions.
Responsibilities: Designed and maintained a modeling strategy using a star schema; created several materialized views for approaching the incremental strategy and update strategy.
Implemented a snowflake schema model after performing the information usage analysis.
Replicated operational tables into staging tables, transformed and loaded data into warehouse tables using the Ab Initio GDE, and was responsible for automating the ETL process through scheduling and exception-handling routines.
Developed the Ab Initio ETL process for vendor, circuit, and fact table loading; interpreted the transformation rules for all target data objects and developed the software components to support the transformations, as well as estimating task durations.
Developed and supported the extraction, transformation, and loading (ETL) process for a data warehouse from their legacy systems using Ab Initio, and provided technical support and hands-on mentoring in the use of Ab Initio.
Prepared and implemented data verification and testing methods for the data warehouse, designed and implemented data staging methods, and stress-tested ETL routines to make sure they don't break under heavy loads.
Created Ab Initio multifile-system shell scripts on different nodes that can process multiple partitions of data at the same time using Ab Initio's m_* and mp_* shell commands (see the sketch after this list).
Wrote several test plans, test cases, and test procedures.
Converted user-defined functions of the business process into Ab Initio user-defined functions.
Understood SAS program code and built architecture diagrams using Ab Initio.
Responsible for technical documentation of the database schema and reports.
Responsible for Quality Assurance & Quality Control based on CMM (Capability Maturity Model).
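A rough sketch of multifile-system handling with the m_* commands mentioned above. Host and directory names are invented, and the exact m_mkfs argument layout varies by Co>Operating System version, so treat this as illustrative only:

#!/bin/ksh
# Create a 4-way multifile system, then copy a serial file into it.
m_mkfs /u01/mfs/mfs4way \
       //node1/u01/mfs/parts/p0 \
       //node2/u01/mfs/parts/p1 \
       //node3/u01/mfs/parts/p2 \
       //node4/u01/mfs/parts/p3

m_cp /u01/serial/loans.dat /u01/mfs/mfs4way/loans.dat   # copy is partitioned across nodes
m_ls /u01/mfs/mfs4way                                   # list multifile contents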
Environment: AIX UNIX, Oracle 8.1.3, AB Initio 1.10.11/2.10.11, TOAD 6.0, ERWIN 4.0

Data Warehousing Specialist


Starbucks - Seattle, WA
March 2003 to February 2004
The project was implemented to analyze the business performance, trends over time and build forecasts
in addition to generating various reports for corporate users and general users. The project involved
processing (validating, profiling, and formatting) of the legacy data to be loaded into the data warehouse.
Responsibilities:
Implemented various dimensions such as time period, product, location, age group, econ class, and gender, tied to the central sales analysis fact table.
Implemented a snowflake schema model after performing the information usage analysis.
Extensively used the Informatica tool in the data staging area.
Transformed and loaded the data into the Teradata target database.
Involved in information usage analysis, dimension analysis, and schema design.
Identified the data sources for data cleansing and data movement.
Performed data transformation and transaction analysis and maintained data quality.
Extensively used the reject file utility to handle, correct, and load data.
Created and used a Dynamic Data Store to enforce standards.
Environment: AIX UNIX, Oracle 8.1.3, AB Initio 1.1.08

Parts Manufacture Analyst


General Motors - Detroit, MI
April 2002 to February 2003
This Decision Support System was developed for analyzing and reporting the historical, time-variant data of order history, customer history, parts shipment history, parts assembly history, vendor/supplier history, and parts bill of material for GM. Data is extracted from different sources (DB2 and Oracle) and sent in flat-file format to the data warehouse environment for data transformation and loading into the target schema (Teradata database). A snowflake schema model was implemented as the logical central data store.

Responsibilities:
Involved in the design of the data mart as part of the data warehouse.
Designed and implemented the ETL architecture with the Informatica tool.
Automated the process of identifying source files and bringing them into the Informatica environment using shell scripts (see the sketch after this list).
Configured target load orders to load the targets from different sources in a particular order.
Configured the sessions to handle updates so as to preserve existing records.
Handled rejected data, corrected the reject file, and loaded it into the target using the reject file utility.
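A minimal sketch of the source-file pickup automation described above; the directories and file-name pattern are invented for illustration:

#!/bin/ksh
# pickup_sources.ksh -- stage newly arrived extracts for the ETL session,
# archiving each file with a timestamp after pickup.
LANDING=/data/gm/landing
STAGING=/data/gm/staging
ARCHIVE=/data/gm/archive

for f in "$LANDING"/parts_*.dat; do
    [ -f "$f" ] || continue                 # pattern matched nothing
    base=$(basename "$f")
    cp "$f" "$STAGING/$base" &&
        mv "$f" "$ARCHIVE/$base.$(date +%Y%m%d%H%M%S)"
    print "staged $base"
done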
Environment: Informatica Power Mart 4.7 (Source Analyzer, Data warehousing designer, Mapping
Designer, Mapplet, Transformations), Informatica Power Center 1.5, DB2, Oracle 8.0, ETL, SQL, PL/
SQL, Windows 98/NT

Oracle/Web Developer
Grindlays Bank - Bangalore, Karnataka
January 2001 to March 2002
Responsibilities: Designed and developed a Material Management system and a Purchase Order system for different clients of our organization. All the projects were developed using Developer 6/2000 (Forms 6/4.5, PL/SQL packages/procedures, and Reports 6/2.5) with Oracle as the backend.
Used design tool Designer 2000/ERWIN in the design/development stage.
Provided technical support for the payroll system running at our office and troubleshot problems in application systems at all our client locations.
Wrote maintenance screens using Forms 4.5 and developed simple ad hoc reports using SQL*Plus and Reports 2.5.
Created database schema objects like tables, views, synonyms, indexes, etc.
Prepared the strategy for query and database optimization using database utilities such as EXPLAIN PLAN (see the sketch after this list).
Carried out full logical backups of the database using EXPORT.
Worked extensively on the logical and physical design of the database in a modular strategy, for which the CASE tool D2K was employed.
Worked on reverse-engineering to get the diagram out of the current physical database structure and
worked on data diagram/ERD.
Followed and carried out the normalization (and de-normalization techniques) during the design phase
of the system.
Involved in the database design, creation of database level triggers & procedures and export / import
of the tables.
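A minimal sketch of the EXPLAIN PLAN workflow mentioned above. The schema and query are invented; on Oracle 7.x/8.x the PLAN_TABLE must first exist (created by utlxplan.sql):

#!/bin/ksh
# explain_query.ksh -- hypothetical tuning helper; connect string is a placeholder.
sqlplus -s scott/tiger <<'EOF'
delete from plan_table where statement_id = 'PO_LOOKUP';

explain plan set statement_id = 'PO_LOOKUP' for
select po_number, vendor_id
from   purchase_orders
where  order_date > sysdate - 30;

select lpad(' ', 2*(level-1)) || operation || ' ' || options || ' ' || object_name plan_step
from   plan_table
start with id = 0 and statement_id = 'PO_LOOKUP'
connect by prior id = parent_id and statement_id = 'PO_LOOKUP';
EOF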
Environment: JAVA, Oracle 7.x/8.x, Visual Basic 5.0, Developer 6/2000, Designer 2000, PL/SQL, Erwin

Oracle Developer
Citadel - Hyderabad, Andhra Pradesh
April 2000 to December 2000
Actively participated in the requirement specification study and communication with the different user
departments.
Involved in the design of tables, forms and reports.
Developed various triggers, functions and stored procedures in PL/SQL.
Coded many shell scripts for efficient job scheduling.
Responsible for creating Oracle PL/SQL packages, stored procedures, and functions to handle daily and weekly uploads and reporting procedures (see the sketch after this list).
Responsibilities included maintaining and enhancing the programs.
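A minimal sketch of the daily-upload pattern described above, compiled through SQL*Plus from a shell script. All table and column names are invented:

#!/bin/ksh
# deploy_load_proc.ksh -- hypothetical; compiles a daily-upload procedure.
sqlplus -s "$DB_CONN" <<'EOF'
create or replace procedure load_daily_txns as
begin
  insert into txn_history (txn_id, acct_no, amount, txn_date)
  select txn_id, acct_no, amount, txn_date
  from   txn_stage
  where  txn_date >= trunc(sysdate);

  delete from txn_stage where txn_date >= trunc(sysdate);
  commit;
exception
  when others then
    rollback;   -- undo the partial load before re-raising
    raise;
end;
/
EOF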
Environment: Oracle 7.3, SQL, PL/SQL and custom packages

Education

Masters in Computer Applications


Madras University - Chennai, Tamil Nadu

Additional Information

TECHNICAL SKILLS:
Data Warehousing Tools: AB Initio 1.11.5, 1.12.1, 1.13, 1.14.35, 1.15.6.1, 1.16.1.3, Business Rules
Environment 1.15.3.0, Informatica Power Center 4.7/5.0.
Data Modeling: Erwin 3.5, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Data Marts.
Tools: Toad 7.1, 9.0, SQL*Loader, SQL*Plus, CA-Autosys.
GUI: Developer2000 (Forms4.5/5.0, Reports 2.5/3.0).
Languages: SQL, PL/SQL, C, C++, Visual Basic, UNIX Shell Scripting, Bourne Shell, Korn Shell, HTML.
RDBMS: Oracle 10g/9i/8i/8.0/7.x, MS SQL Server 6.5/7.0/2000, DB2 6.1 (UDB), NCR Teradata.
Operating Systems: Sun Solaris, MVS, HP-UX, IBM AIX, Linux, Novell NetWare, Sun Ultra, Sun SPARC, Windows NT/2000/98/95, IBM ES-9000, OS/2.
