Professional Documents
Culture Documents
Work Experience
Created generic graphs to move data from different sources, apply business logic, and
load it into target databases.
Created graphs to generate DMLs dynamically for Oracle and SQL Server.
Created graphs to generate XFRs dynamically for all applications on Oracle and SQL Server.
Created an Ab Initio library package defining various redaction functions.
Created operational graphs and scripts for Oracle, SQL Server, and system files to redact data
from source to target.
Extensively used EME to create different projects and to store input-request information for various
applications.
Used EME as a version control tool to perform check-ins and check-outs.
Involved in unit testing, system testing, and defect fixing.
Prepared documents to explain the whole operations involved in redacting for all the applications.
Developed Ab Initio graphs for data validation using validation components.
Documented all weekly deliverables and took responsibility for completing assigned tasks
by their deadlines.
Wrote wrapper scripts to call the generic graphs and automate the redaction process.
Wrote Ab Initio library functions to transform and redact customer-sensitive data.
Wrote Autosys JIL scripts to schedule redaction jobs in production.
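A wrapper of the kind described above might look like the following sketch; the graph name (redact_generic.ksh), parameter flags, and directory layout are illustrative assumptions, not the actual project artifacts.

```shell
#!/bin/sh
# Sketch of a wrapper around a deployed generic redaction graph.
# All names and paths here are hypothetical.

GRAPH_DIR=${GRAPH_DIR:-/apps/abinitio/run}

run_redaction() {
    # $1 = source type (oracle|sqlserver), $2 = table to redact
    src_type=$1
    table=$2
    cmd="$GRAPH_DIR/redact_generic.ksh -SRC_TYPE $src_type -TABLE $table"
    # In production the deployed graph would be executed here:
    # $cmd || exit 1
    echo "$cmd"
}

run_redaction oracle CUSTOMER_MASTER
```

Parameterizing the source type and table this way is what lets one generic graph serve every application.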
Environment: Ab Initio Co>OS 2.14.128, GDE 1.14.37, Oracle, SQL Server, Flat files, Windows XP,
UNIX Servers, Autosys.
Migrated ETL and SQL programs to QA and prepared appropriate docs for code move to QA and
production.
Worked with third-party consultants and experts to accomplish specific projects and tasks.
Helped identify gaps and areas for improvement to enhance the state of current processes. Maintained
the integrity of our systems as well as the stability and reliability of supported processes.
Recommended changes to the architectural data schema as needed.
Scheduled the jobs in Autosys and supported them in production.
Wrote UNIX shell scripts to test the application and SQL queries for the Oracle database.
Maintained backup schedules for database storage. Read and interpreted UNIX logs.
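As a minimal illustration of the kind of shell helper used for testing and log reading, the sketch below counts Oracle errors in a log file; the log location and message format are assumptions.

```shell
#!/bin/sh
# Minimal production-support sketch: count Oracle errors in an
# application log. Log path and format are illustrative.

count_ora_errors() {
    # grep -c prints the number of lines containing an ORA- error code.
    grep -c 'ORA-' "$1"
}

# Demonstrate with a tiny sample log:
LOG=/tmp/app_demo.log
printf '%s\n' 'job start' 'ORA-00942: table or view does not exist' 'job end' > "$LOG"
count_ora_errors "$LOG"
```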
Environment: Ab Initio GDE 1.13.29, 1.14.39, Co>Operating System 2.13.8, 2.14.112, UNIX sun4u OS
version 5.8, Oracle 10g, 9i, Toad 7.11, 9.0, CA-Autosys, Data Migrator, BO XI R2.
Ab Initio Consultant
Aetna Insurance, Middletown, CT
July 2006 to February 2007
Aetna Health Insurance provides prescription management and health information. This process loads client
and standard data into a warehouse database. The data is thoroughly checked for validity during
processing. Audit data is created that characterizes the data in various categories; this data is used in
reports that document the quality of the data as well as the quality of the load process. These reports also
help identify changes in the client's data over time. Client-specific monitoring allows the load process
to be interrupted for review or correction of the data.
Responsibilities:
Used Ab Initio as the ETL tool to pull data from source systems, then cleanse, transform, and load the data
into databases.
Tuned Ab Initio graphs for better performance.
Developed complex Ab Initio XFRs to derive new fields and satisfy rigorous business requirements.
Customized the external credit-fraud-protection data to the new physical data model.
Developed shell scripts to customize the ETL Ab Initio Graphs at runtime.
Worked with Business Users to define the functional metrics for the external credit fraud protection
data customization.
Automated the data loads using UNIX shell scripting for the production, testing, and development
environments.
Implemented data parallelism in graphs by dividing data into segments with Ab Initio partition
components and operating on each segment simultaneously.
Performed transformations of source data with transform components such as Join, Match Sorted, Dedup
Sorted, Denormalize, Reformat, and Filter by Expression.
Created summary tables using Rollup, Scan, and Aggregate.
Made wide use of lookup files when pulling limited-size data from multiple sources.
Wrote and modified several application-specific UNIX config scripts to pass environment
variables.
Coded graphs to extract data from COBOL files.
Developed UNIX Korn shell wrapper scripts to accept parameters and scheduled the processes using
Zeak. Made extensive use of Mega death to load data from flat files into tables.
Developed a script that automated the loading process to the DB2 database.
Worked closely with Cognos Planning Contributor.
Environment: Ab Initio GDE 1.13.26/1.11.6, Co>Operating System 2.11.7/2.13.8, UNIX, Zeak, TSO
Mainframe, DB2, COBOL, SQL Server, Cognos.
Ab Initio consultant
Walgreen's - Chicago, IL
November 2005 to February 2006
Description: 340B is a government-sponsored program that allows eligible clients matched with a pharmacy
(Walgreen's and non-Walgreen's) to fill prescriptions at a much lower cost. Clinics must meet certain
criteria to be eligible, and stock replenishments are ordered from the drug wholesaler. The project goal is
to provide client and accounting reports for 340B claims and to automate the drug-replenishment process
with the drug wholesaler. Plans will be set up in PBS that have a 340B component. Claims from PBS will be
transmitted to PBM/DSS, where reporting and order replenishment take place.
Responsibilities: Worked with the DW development teams to define new and innovative ways of
representing and analyzing data for implementation in the DW application environments.
Defined data requirements and design associated with the implementation of various Data Mart
initiatives.
Created and maintained database and application requirements and design information, working with the
data modelers using CASE tools and logical models.
Worked with the database and application team members to ensure implementation of the intended
design.
Teamed with Business Intelligence team to ensure delivery of intended business meaning.
Worked with the Data Management group to create data-administration QA standards.
Resolved data issues and facilitated data analysis.
Provided production support of the existing applications (on call as necessary).
Performed daily maintenance and administration of DW production runs.
Served as point-of-contact for tech problem resolution.
Troubleshot SQL issues within escalation guidelines.
Participated as a tech resource as needed for customer issue reviews.
Gained a thorough understanding of assigned applications with intimate knowledge of customer
business purpose, operations, systems configuration, and application software environment.
Stayed current and proficient on existing and emerging DW and ETL technology through self-study and
company- and vendor-provided training opportunities.
Executed all non-revenue-based change and shared infrastructure change as needed.
Conducted quality assurance walkthroughs.
Provided status on a regular basis.
Environment: Ab Initio 1.13.6, Oracle 9i, Sun Solaris UNIX
Ab Initio Consultant
XEROX - Portland, OR
June 2005 to November 2005
Description: The core project involved an overall migration of multiple data sources from the Oracle 11i
applications to an Oracle data warehouse to facilitate efficient decisions in areas such as retailing and
customer-information processing. The other phase involved extraction, transformation, and loading of data
into the dimension and fact tables and flat files. The data from different operational sources was
extracted, transformed, and finally loaded into various dimension and fact tables.
Responsibilities: Involved in the data-movement design and development to migrate legacy data to a newly
defined data model for the data warehouse.
Loaded the fact and dimension tables with a focus on performance.
Implemented the Ab Initio environment, which connects to a central EME repository.
Designed and deployed optimized Ab Initio graphs for high-volume transformations.
Implemented the aggregation of same data across different tables in the legacy system.
Made wide use of lookup files when pulling limited-size data from multiple sources.
Phase two involved design and development for validating the legacy data with external data providers,
which included credit fraud protection.
Used Ab Initio data-cleansing functions such as is_valid, is_defined, is_error, string_substring,
string_concat, and other string, date, inquiry, and miscellaneous functions.
Tuned Ab Initio graphs for better performance.
Developed complex Ab Initio XFRs to derive new fields and satisfy rigorous business requirements.
Customized the external credit-fraud-protection data to the new physical data model.
Developed shell scripts to customize the ETL Ab Initio Graphs at runtime.
Worked with Business Users to define the functional metrics for the external credit fraud protection
data customization.
Worked extensively with EME Repository.
Automated the data loads using UNIX shell scripting for the production, testing, and development
environments.
Implemented data parallelism in graphs by dividing data into segments with Ab Initio partition
components and operating on each segment simultaneously.
Environment: Oracle 8i, Ab Initio CO>OS 2.13, Ab Initio GDE 1.13, EME, UNIX, HP superdome/RP
8400, and Windows NT/2000
Ab Initio Consultant
Wells Fargo - Sunnyvale, CA
March 2004 to May 2005
Description: Wells Fargo receives import files from mortgage insurers and sends files to various
insurers. The flexibility of the file formats is achieved not by having multiple defined import files, but
by having a single type of import file with multiple defined record types. Generated reports in
MicroStrategy to elicit information for loan selling and buying decisions.
Responsibilities: Designed and maintained a modeling strategy using a star schema and created several
materialized views to implement the incremental and update strategies.
Implemented Snowflake Schema Model after performing the information usage analysis.
Replicated operational tables into staging tables, transformed and loaded data into warehouse tables using
Ab Initio GDE, and was responsible for automating the ETL process through scheduling and
exception-handling routines.
Developed the Ab Initio ETL process for Vendor, Circuit, and Fact table loading; interpreted the
transformation rules for all target data objects; developed the software components to support the
transformation; and estimated task durations.
Developed and supported the extraction, transformation, and loading (ETL) process for a data
warehouse from legacy systems using Ab Initio, and provided technical support and hands-on
mentoring in the use of Ab Initio.
Prepared and implemented data-verification and testing methods for the data warehouse, designed
and implemented data-staging methods, and stress-tested ETL routines to make sure they do not
break under heavy loads.
Created Ab Initio multifile-system shell scripts on different nodes that can process multiple partitions
of data at the same time using Ab Initio's m_*- and mp_*-type shell commands.
Wrote several test plans, test cases, and test procedures.
Converted user-defined business-process functions into Ab Initio user-defined functions.
Understood SAS program code and built architecture diagrams using Ab Initio.
Responsible for technical documentation of database schema and reports
Responsible for Quality Assurance & Quality Control based on CMM (Capability Maturity Model)
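The multifile systems mentioned above are created with the Co>Operating System's m_mkfs utility. In the sketch below the node names and mount points are hypothetical, and the command is only assembled, not executed, so it runs without Ab Initio installed.

```shell
#!/bin/sh
# Sketch of laying out a 4-way multifile system with m_mkfs.
# Node names and mount points are illustrative assumptions.

CONTROL=//node0/u01/mfs/ctrl
PARTITIONS="//node1/u01/mfs/p1 //node2/u01/mfs/p2 //node3/u01/mfs/p3 //node4/u01/mfs/p4"

build_mkfs_cmd() {
    # m_mkfs takes the control directory followed by one data
    # partition per node.
    echo "m_mkfs $CONTROL $PARTITIONS"
}

build_mkfs_cmd
```

Graphs then read and write the multifile as a single logical file while each node processes its own partition in parallel.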
Environment: AIX UNIX, Oracle 8.1.3, AB Initio 1.10.11/2.10.11, TOAD 6.0, ERWIN 4.0
Responsibilities:
Involved in the design of Data mart as part of the Data warehouse
Designed and Implemented ETL architecture with Informatica tool
Automated the process of identifying source files and bringing the source files into Informatica
environment using shell scripts.
Configured target load orders to load the targets from different sources in a particular order.
Configured the sessions to handle the updates to preserve the existing records.
Handled rejected data: corrected the reject file and loaded it into the target using the reject-file utility.
Environment: Informatica Power Mart 4.7 (Source Analyzer, Data warehousing designer, Mapping
Designer, Mapplet, Transformations), Informatica Power Center 1.5, DB2, Oracle 8.0, ETL, SQL, PL/
SQL, Windows 98/NT
Oracle/Web Developer
Grindlays Bank - Bangalore, Karnataka
January 2001 to March 2002
Responsibilities: Designed and developed a Material Management system and a Purchase Order system
for different clients of our organization. All the projects were developed using Developer 6/2000 (Forms
6/4.5, PL/SQL packages/procedures, and Reports 6/2.5) with Oracle as the backend.
Used design tool Designer 2000/ERWIN in the design/development stage.
Provided technical support to the Payroll System running at our office and troubleshot problems in
application systems at all our client locations.
Wrote maintenance screens using Forms 4.5 and developed simple ad hoc reports using SQL*Plus and
Reports 2.5.
Created database schema objects like tables, views, synonyms, indexes, etc.
Prepared the strategy for query and database optimization using database utilities such as EXPLAIN PLAN.
Carried out full logical backups of the database (export).
Worked extensively on the logical and physical design of the database using a modular strategy, for which
the CASE tool D2K was employed.
Reverse-engineered the current physical database structure to produce data diagrams/ERDs.
Followed and carried out normalization (and denormalization) techniques during the design phase
of the system.
Involved in the database design, creation of database level triggers & procedures and export / import
of the tables.
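The EXPLAIN PLAN work above can be sketched as a shell step that emits the classic Oracle 7/8-era plan_table query; the table, bind variable, and connect string are placeholders, and the SQL*Plus invocation is commented out so the sketch runs without an Oracle client.

```shell
#!/bin/sh
# Sketch: drive EXPLAIN PLAN from shell via SQL*Plus.
# Table name, statement, and connect string are hypothetical.

explain_sql() {
    cat <<'EOF'
EXPLAIN PLAN SET STATEMENT_ID = 'q1' FOR
  SELECT * FROM customer WHERE cust_id = :1;

SELECT LPAD(' ', 2 * (LEVEL - 1)) || operation || ' '
       || options || ' ' || object_name AS step
  FROM plan_table
 START WITH id = 0 AND statement_id = 'q1'
CONNECT BY PRIOR id = parent_id AND statement_id = 'q1';
EOF
}

# With a client installed this would be piped into SQL*Plus:
# explain_sql | sqlplus -s scott/"$ORA_PWD"@proddb

explain_sql
```

The hierarchical CONNECT BY query indents each plan step under its parent, giving a readable execution tree.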
Environment: JAVA, Oracle 7.x/8.x, Visual Basic 5.0, Developer 6/2000, Designer 2000, PL/SQL, Erwin
Oracle Developer
Citadel - Hyderabad, Andhra Pradesh
April 2000 to December 2000
Actively participated in the requirements-specification study and communicated with the different user
departments.
Involved in the design of tables, forms and reports.
Developed various triggers, functions and stored procedures in PL/SQL.
Coded many shell scripts for efficient job scheduling.
Responsible for creating Oracle PL/SQL packages, stored procedures, and functions to handle daily and
weekly uploads and reporting procedures.
Responsibilities included maintaining and enhancing the programs.
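An upload package of the kind described might have the shape below; the package and procedure names are hypothetical and the bodies are stubs, and the SQL*Plus invocation is commented out so the sketch runs standalone.

```shell
#!/bin/sh
# Hedged sketch: emit a minimal PL/SQL package skeleton for
# daily/weekly uploads. All names here are illustrative.

emit_upload_pkg() {
    cat <<'EOF'
CREATE OR REPLACE PACKAGE upload_pkg AS
  PROCEDURE daily_upload;
  PROCEDURE weekly_report;
END upload_pkg;
/
CREATE OR REPLACE PACKAGE BODY upload_pkg AS
  PROCEDURE daily_upload IS
  BEGIN
    NULL;  -- e.g. INSERT ... SELECT from a staging table, then COMMIT
  END daily_upload;

  PROCEDURE weekly_report IS
  BEGIN
    NULL;  -- e.g. aggregate the week's uploads into a report table
  END weekly_report;
END upload_pkg;
/
EOF
}

# emit_upload_pkg | sqlplus -s "$DB_USER"/"$DB_PWD"@proddb
emit_upload_pkg
```

Keeping the upload and report logic in one package lets the scheduled shell jobs call a single, stable interface.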
Environment: Oracle 7.3, SQL, PL/SQL and custom packages
Education
Additional Information
TECHNICAL SKILLS:
Data Warehousing Tools: AB Initio 1.11.5, 1.12.1, 1.13, 1.14.35, 1.15.6.1, 1.16.1.3, Business Rules
Environment 1.15.3.0, Informatica Power Center 4.7/5.0.
Data Modeling: Erwin 3.5, Star-Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Data
Marts.
Tools: Toad 7.1, 9.0, SQL*Loader, SQL*Plus, CA-Autosys.
GUI: Developer2000 (Forms4.5/5.0, Reports 2.5/3.0).
Languages: SQL, PL/SQL, C, C++, Visual Basic, UNIX Shell Scripting, Bourne Shell, Korn Shell, HTML.
RDBMS: Oracle 10g/9i/8i/8.0/7.x, MS SQL Server 6.5/7.0/2000, DB2 6.1 (UDB), NCR Teradata.
Operating Systems: Sun Solaris, MVS, HP-UX, IBM AIX, Linux, Novell NetWare, Sun Ultra, Sun SPARC,
Windows NT/2000/98/95, IBM ES-9000, OS/2.