
Data Analyst Quality

Location:
New Haven, CT
Salary:
78000
Posted:
April 21, 2025


Resume:

Professional Experience

Client: Travelers, Connecticut, USA Nov 2023 – Present

Role: Data Analyst

Description: Travelers is a leading provider of property and casualty insurance products and services for businesses and individuals. Played a key role in developing and optimizing data solutions, creating reports, and automating data quality processes to support effective decision-making.

Responsibilities:

Create scripts and programs to gain an understanding of data sets, discover data quality and data integrity issues associated with the analytical data, and perform root cause analysis for those issues. Write complex SQL scripts to analyze data in different databases/data warehouses such as Snowflake, Teradata, and Redshift.
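
For illustration only, a minimal sketch of this kind of data-profiling script in PySpark (the table name and key column are hypothetical; assumes a Databricks/Spark session is available):

# Profile a table for basic quality issues: row count, null counts, duplicate keys.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("analytics.policy_claims")          # hypothetical analytical table

total_rows = df.count()
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
).collect()[0].asDict()
duplicate_keys = (
    df.groupBy("claim_id")                            # hypothetical business key
      .count()
      .filter(F.col("count") > 1)
      .count()
)

print(f"rows={total_rows}, duplicate keys={duplicate_keys}")
print("null counts per column:", null_counts)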

Perform segmentation analytics for each campaign using database technologies both on premises (such as SQL, Teradata, UNIX) and on cloud platforms using AWS technologies and big data technologies such as Spark, Python, and Databricks. Created a multi-page report dashboard in Google Looker (Data Studio) using Google BigQuery data.

Proficient in SQL, Python, and other scripting languages used in data transformation and control-flow activities.

Create custom reports and dashboards using business intelligence software such as Tableau and QuickSight to present data analysis and conclusions. Create automated solutions using Databricks, Spark, Python, Snowflake, and HTML.

Create programs using Python to read real-time data from SDP (Streaming Data Platform), perform analysis, and load the data to the analytical cloud data warehouse. Migrate scripts and programs from the on-premises environment to the AWS Cloud environment.
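
For illustration, a minimal sketch of this streaming read pattern, assuming the SDP exposes a Kafka-compatible endpoint (the broker address, topic, and output paths are hypothetical):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read real-time events from a Kafka-compatible streaming source.
events = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "sdp-broker:9092")   # hypothetical endpoint
         .option("subscribe", "claims-events")                   # hypothetical topic
         .load()
         .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
)

# Land the stream in cloud storage for downstream loading into the warehouse.
query = (
    events.writeStream
          .format("parquet")
          .option("path", "s3://analytics-bucket/claims_events/")       # hypothetical path
          .option("checkpointLocation", "s3://analytics-bucket/_chk/")  # hypothetical path
          .trigger(processingTime="1 minute")
          .start()
)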

Developed and automated financial planning reports using Python and Snowflake, supporting month-end close and forecasting cycles.
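
A minimal sketch of this kind of automated reporting pull, assuming the snowflake-connector-python package (account, credentials, table, and query are hypothetical):

import snowflake.connector

# Connect to Snowflake (credentials would normally come from a secrets manager).
conn = snowflake.connector.connect(
    account="myaccount",            # hypothetical account
    user="svc_reporting",
    password="***",
    warehouse="REPORTING_WH",
    database="FINANCE",
    schema="PLANNING",
)

query = """
    SELECT fiscal_month, cost_center, SUM(actual_amount) AS actuals
    FROM gl_balances                -- hypothetical table
    WHERE fiscal_month = %s
    GROUP BY fiscal_month, cost_center
"""

cur = conn.cursor()
cur.execute(query, ("2025-03",))
report_df = cur.fetch_pandas_all()   # requires the connector's pandas extras
report_df.to_excel("month_end_close_report.xlsx", index=False)
conn.close()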

Built and optimized multi-page dashboards in Google Looker and QuickSight, integrating data from Google BigQuery and AWS Redshift, analogous to SAP SAC stories and planning models

Created ETL pipelines using Databricks and Spark, similar to SAC data actions and scripting capabilities.

Implemented automated data quality alerts using Python + Slack API, ensuring data reliability across financial datasets.
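
As an illustration of this alerting pattern, a minimal sketch using a Slack incoming webhook (the webhook URL, table, and check details are hypothetical):

import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"   # hypothetical webhook

def send_data_quality_alert(table_name: str, issue: str) -> None:
    """Post a data quality alert to a Slack channel via an incoming webhook."""
    message = {"text": f":warning: Data quality issue in {table_name}: {issue}"}
    response = requests.post(SLACK_WEBHOOK_URL, json=message, timeout=10)
    response.raise_for_status()

# Example usage after a failed validation check:
# send_data_quality_alert("finance.gl_balances", "null amounts found in 132 rows")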

Experience administering data security using access policies for sensitive financial data, aligned with SAC access controls.

Worked on different data models for Claims, Members, and Providers across different claim types for various Health Partner Incentive programs. Created pipelines using the GUI in Azure Data Factory (ADF).

Migrating campaigns from Unica Affinium campaign marketing tool to Quantum.

Automate the process of sending data quality alerts to a Slack channel and email using Databricks, Python, and HTML, alerting users when there are issues with the data.

Perform data comparison of SDP (Streaming Data Platform) real-time data against AWS S3 data and Snowflake data using Databricks, Spark SQL, and Python.
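
A minimal sketch of this kind of reconciliation in Databricks, assuming the Spark Snowflake connector is available and that both sides share matching column names (connection options, paths, and table names are hypothetical; credentials omitted):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Data landed in S3 from the streaming platform.
s3_df = spark.read.parquet("s3://analytics-bucket/claims_events/")    # hypothetical path

# The same data as loaded into Snowflake.
sf_options = {
    "sfURL": "myaccount.snowflakecomputing.com",   # hypothetical account
    "sfUser": "svc_analytics",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ANALYTICS_WH",
}
sf_df = (
    spark.read.format("snowflake")
         .options(**sf_options)
         .option("dbtable", "CLAIMS_EVENTS")        # hypothetical table
         .load()
)

# Compare row counts and surface records present in S3 but missing from Snowflake.
print("S3 rows:", s3_df.count(), "Snowflake rows:", sf_df.count())
only_in_s3 = s3_df.exceptAll(sf_df.select(*s3_df.columns))
print("Rows in S3 but not in Snowflake:", only_in_s3.count())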

Applied advanced information management and data processing techniques, including Hadoop (HDFS), to extract the value locked up in this data, processing large data sets in parallel across a Hadoop cluster using the Hadoop MapReduce framework.

Create and monitor production batch jobs that load analytical data into the data source tables on a daily basis, fixing and re-executing jobs when there is a job failure.

Alert data consumers about delays in data loads to the data sources/tables using Slack bot API integration with Python code. Create batch programs using UNIX shell scripts and Teradata BTEQ.

Power BI integrates with other Microsoft tools like Azure, Excel, SharePoint, Teams, Power Automate, and Power Apps, enhancing data analysis and collaboration.

Developed ETL routines using SSIS packages, planning an effective package development process and designing the control flow within the packages. Develop comprehensive dashboards and reports using tools like Tableau, Power BI, and Excel to present data-driven insights to stakeholders. Worked on creating DDL/DML scripts for the data models.

Create fulfilment reports such as data and business-intent validation reports. Perform ad-hoc queries and extract data from existing data stores.

Manage the entirety of each campaign's logic, including audience segmentation, exclusions, and assignment of offers and channels.

Environment: Teradata V13.10, Unica Affinium Campaign 8.1/8.0/7.5/7.2/6.0, Teradata SQL Assistant, BTEQ, UNIX, SQL, Python, Databricks, Snowflake, Redshift, AWS, Spark, Quantum, HTML.

Client: The Hartford, Connecticut, USA Nov 2022 - Oct 2023

Role: Data Analyst intern

Description: The Hartford is a large American insurance company that provides a wide range of insurance and financial services. Managed enterprise-wide data modeling, integration, and governance to support business intelligence solutions and ensure data consistency across multiple sources.

Responsibilities:

• Responsible for technical data governance, enterprise-wide data modeling, and database design; developed multidimensional data models to support BI solutions as well as other common industry data from external systems.

• Managed data integration from multiple sources into DOMO, ensuring data consistency and accuracy.

• Working with business partners and team members, gathered and analyzed requirements, translating these into solutions for database designs supporting transactional system data integration, reports, spreadsheets, and dashboards.

• Prepared report extracts and integrated existing data as needed to support analysis, visualizations, and statistical models. Worked with Azure Data Factory (ADF), a SaaS solution for composing and orchestrating Azure data services. Integrated data from multiple databases and systems into Power BI for comprehensive analysis.

• Derived logical data model from the physical data model through the manual reverse engineering process.

• Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL access on Hadoop data.

• Worked with project management, business teams, and departments to assess and refine requirements to design/develop BI solutions using Tableau.

• Managed data modeling and integration across DOMO, Power BI, and Azure Data Factory, which can be translated into SAC model and data pipeline experience.

• Worked on multidimensional data models and supported planning dashboards for executive stakeholders.

• Designed STAR schemas for OLAP environments and implemented tabular data models used in budgeting/forecasting applications.

• Utilized Azure Data Factory to orchestrate data refresh schedules and pipelines—similar to SAC’s integration mechanisms.

• Researched and developed hosting solutions using Tableau and other third-party hosting and software-as-a-service solutions.

• Used SQL on the newer AWS databases like Redshift and Relational Database Service (RDS) and worked with various RDBMS like Oracle 11g and SQL Server.

• Created SQL tables with referential integrity and developed SQL queries using SQL Server and Toad.

• Worked with PivotTables and multi-table data sets of up to 140 million records in SQL (MS SQL Server, SAS Proc SQL, etc.). Power BI provides a rich set of visualizations and tools to turn raw data into meaningful insights, empowering users to make data-driven decisions.

• Created tabular data models and implemented Tableau for a POC in a SharePoint environment.

• Partnered directly with the data architect, clients, ETL developers, other technical data warehouse team members, and database administrators to design and develop high-performing databases and maintain consistent data element definitions.

• Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.

• Created logical and physical data models and reviewed these models with the business team and data architecture team.

• Transformed the logical data model into the physical data model, ensuring primary key and foreign key relationships in the PDM, consistency of data attribute definitions, and primary index considerations.

• Created SQL scripts to find data quality issues and to identify key data anomalies and data validation issues.

• Responsible for full data loads from production to the AWS Redshift staging environment and for creating Hive tables, loading data, and writing Hive queries.

• Designed different types of STAR schemas for detailed data marts and plan data marts in the OLAP environment.

• Produced and enforced data standards and maintained a repository of data architecture artifacts and procedures.

• Provided architecture patterns, tooling choices, and standards for master data and hierarchy life cycle management.

Environment: Power Pivot, SQL, MS Excel, Facets, Tableau, CSV files, Hadoop, AWS Redshift, Python, XML files, Linux, Teradata SQL Assistant, Oracle 12c.

Client: Jio, Mumbai, India Feb 2021 - Jul 2022

Role: Data Analyst

Description: Jio is a major Indian telecom company known for its mobile and broadband services. Contributed to developing and maintaining interactive dashboards and reports, addressing business requirements, and supporting data integration and analysis.

Responsibilities:

Analyzed requirements; developed and debugged applications using BI tools.

Worked with end users/customers to understand business requirements, recommended technical solutions, and documented functional requirements.

Designed, developed, tested, and maintained Tableau reports based on user requirements.

Applied table calculations like RANK, Percentage Difference, Percent of Total, Moving Average, YTD, YOY, etc., for various measures/calculated fields.

Worked on view orientation, sizing, layout, colour, fonts, and tooltips for building interactive dashboards/reports.

Worked on year-end database audit activities and performed year-end activities for Business Panorama.

Prepared report extracts and integrated existing data as needed to support analysis, visualizations, and statistical models. Worked with Azure Data Factory (ADF), a SaaS solution for composing and orchestrating Azure data services. Integrated data from multiple databases and systems into Power BI for comprehensive analysis.

Designing, developing, and providing technical solutions through Tableau reports based on user requirements.

Extensively worked on designing and creating Tableau dashboards, including automated dashboards, to help stakeholders implement strategic planning in the organization.

Designed and deployed interactive dashboards using Tableau and Looker, representing KPI-driven stories and visualizations akin to SAC's "stories".

Created calculations and data models that handled YOY, YTD, moving averages, and dynamic filters, which are common in SAC reporting.

Performed forecasting and segmentation analysis in Tableau, bridging functional similarities with SAC's planning models and data actions.

Extensively worked on advanced visualization techniques in Tableau to help business users represent data visually and solve complex problems. Provided application support for all Business Panorama applications.

Worked with the leadership/management team to visualize key customer metrics, driving better solutions in the consumer POD and supply chain.

Utilized various Tableau functions like LOD expressions, WINDOW_SUM, WINDOW_AVG, and conditional functions (IF, nested IF, CASE, etc.). Performed database audits and created reports in Tableau based on the audit logs.

Created and maintained interactive dashboards and reports using Tableau and Looker to visualize key metrics and trends, enabling stakeholders to make informed decisions.

Scheduled data refresh on Tableau Server for weekly and monthly increments upon business requirement and ensured views and dashboards display changes in the data accurately.

Published dashboards to the Tableau server and communicated with the end stakeholders for PMO usage.

Integrated projects on the server and organized workshops on the projects for better usage of the capabilities built into the dashboards. Worked on administration tasks such as setting permissions, managing ownership, providing access to users, and adding them to specific groups.

Performed data loads on weekly and monthly schedules and worked on various issues based on the tickets received in IRIS.

Environment: Tableau 2018.3 (Desktop/Server), AWS Redshift, Alteryx, Talend, Oracle SQL Workbench, Control-M, AWS, Jira, Agile, SSMS, Microsoft Office Suite (Word, Excel, Macros, Pivot tables, PowerPoint, SharePoint).

Client: Kotak Mahindra Bank, Mumbai, India Mar 2019 - Jan 2021

Role: Programmer Analyst intern

Description: Kotak Mahindra Bank is a leading Indian financial institution offering a wide range of banking and financial services. Involved in creating and optimizing business reports, performing data validations, and designing complex queries to ensure data integrity and meet client requirements.

Responsibilities:

Created summary reports and tabular reports for clients based on their business requirements. Provided data analysis and designed and developed various business reports.

Designed/developed SQL scripts to move data from the staging tables to the target tables. Fine-tuned SQL to optimize performance, spool space usage, and CPU usage.

Performed data validation and data integrity checks before delivering data to operations and financial analysts.

Involved in data cleansing and analysis using pivot tables, formulas (VLOOKUP and others), data validation, conditional formatting, and graph and chart manipulation.

Created financial reports and dashboards using SQL and Excel Macros, focused on monthly close, audit, and reconciliation activities—closely related to SAC's forecasting and month-end close use cases.

Designed complex stored procedures and SQL scripts for data transformation, highlighting experience working with SAP BPC-style backend logic.

Developed complex SQL queries, stored procedures, packages, functions, and database triggers. Used different joins for creating views from multiple tables; designed the order of flow for job execution and scheduled the jobs.

Wrote several SQL queries using Teradata SQL Assistant for ad-hoc data pull requests, queried database objects to validate data quality, and was involved in performance tuning of slow-running SQL queries.

Involved in the development of PL/SQL scripts for pre- and post-session processes to automate loads.

Worked with various Business users to gather reporting requirements and understand the intent of reports and attended meetings to provide updates on the status of the projects.

Analyzed report requirements and developed the reports by writing Teradata SQL queries and using MS Excel, MS Access, PowerPoint, and UNIX.

Environment: SAS, SQL, MS Excel, MS Access, PowerPoint, Tableau, VB Scripting, Macros.

VAMSI KIRAN BOKKA

Profile Summary

Junior Data Analyst with over 3 years of experience in transforming complex data into actionable insights. Skilled in data warehousing, dashboard creation, and predictive modeling. Looking to bring expertise in SQL, Tableau, and Python to drive data strategy and deliver measurable business impact.

More than 5 years of experience in the design, development, implementation, testing, data analysis, and reporting of applications using Teradata, Oracle, SQL, PL/SQL, Python, ADF, Spark, and UNIX.

• Expertise in full project life cycle development (SDLC), from initial system planning and technology acquisition through installation, training, and operation. Deep understanding of technology with a focus on delivering business solutions.

• Experience working with different databases/data warehouses like Teradata, Oracle, AWS Redshift, and Snowflake.

• Performed data analysis using SQL, PL/SQL, Python, Spark, Databricks, Teradata SQL Assistant, SQL Server Management Studio, and SAS.

• Strong knowledge and use of development methodologies, standards, and procedures. Looker can integrate with data science models developed in Python, R, or other environments.

• Proficient in writing packages, stored procedures, functions, views, materialized views, and database triggers using SQL and PL/SQL in Oracle.

• Developing and executing campaigns using Unica's Affinium Campaign tool.

• Building flowcharts from existing templates and from scratch in Unica's Affinium Campaign.

• Used Power BI, which supports a wide range of data sources, including SQL Server, Azure SQL Database, Excel, SharePoint, Google Analytics, and Salesforce.

• Used ADF's Data Flow capabilities to perform data transformations such as filtering, sorting, joining, aggregating, and deriving new columns without needing to write complex code.

• Used custom visuals from the Power BI marketplace, providing additional flexibility for creating tailored visualizations such as heat maps, sparklines, bullet charts, and more.

• Created automated solutions/tools using Python and JavaScript.

• Experience in performance tuning and optimization of SQL statements.

• Experience in leveraging DOMO for data visualization, reporting, and analytics.

• Performed QA/QC from a functional as well as backend perspective to validate campaign-related data.

• Experienced with VLOOKUP (vertical lookup), a function used to search for a value in the first column of a table (range) and return a value in the same row from a specified column; it is one of the most commonly used lookup functions in Excel.

• Experience integrating ADF with other Azure services (Azure Databricks, Azure Functions, Logic Apps, etc.) for complex data workflows.

• Worked with different loading utilities in Teradata and Oracle, such as Teradata BTEQ, MLOAD, FLOAD, TPUMP, and Oracle SQL*Loader.

• Excellent knowledge of Perl and Unix. Experienced in working with Excel pivot tables and VBA macros for various business scenarios. Proficient in leveraging Power BI to transform complex data sets into compelling visual stories that drive business insights and decision-making.

Education

Master's in Computer Science from Sacred Heart University, Connecticut, USA

Technical Skills

Data Modeling Tools: Erwin, PowerDesigner, Project

Google Looker/Google Data Studio: Reports and dashboards

Methodologies: SDLC, JAD Sessions, Ralph Kimball Methodologies

Databases: MS Access, SQL, Oracle, DB2, Teradata, Big Data, NoSQL, MySQL, Azure Data Factory (ADF), Azure Data Explorer (ADE)

BI Tools: Power BI (2.x), Tableau, Python, ADF, Looker

Reporting Tools: BO, OBIEE

Languages: PL/SQL, SQL, HTML

OS: Windows

Other Tools: TOAD, MS-Office suite (Word, Excel, Project and Outlook), BTEQ, Teradata

Environment: Windows (95, 98, 2000, NT, XP), UNIX

RDBMS: Oracle 10g/9i/8i/7.x, MS SQL Server, UDB DB2 9.x, Teradata, MS Access 7.0

Objective

**************@*****.***

+1-475-***-****

Data Analyst


