
Senior Business Data Analyst

Location:
Financial District, MA, 02109
Posted:
February 20, 2025

Resume:

Venkata Ramana Bellamkonda

+1-860-***-**** ***********@*****.*** LinkedIn

Senior Business Analyst

PROFESSIONAL SUMMARY:

Senior Business Data Analyst with 9 years of experience working with stakeholders from the analytics, IT, and business teams to collect requirements, establish KPIs, and provide Splunk development for data-driven insights and strategic decision-making.

Reviewed, analyzed, and evaluated business systems and user needs to align with overall business strategies, ensuring system functionalities met operational goals.

Documented and gathered detailed business and technical requirements for the IRIS 2.0 project, a custom case management system for the Division of Vocational Rehabilitation, ensuring alignment between business needs and system capabilities.

Expertise in SWIFT MT/MX message formats, ISO20022 migration, and payment message conversion, ensuring compliance with financial industry standards.

Hands-on experience with Fedwire and CHIPS for high-value and real-time gross settlement (RTGS) payments, integrating with banking systems to optimize financial transactions.

Led Agile projects, managing requirements through sprints and collaborating with stakeholders to prioritize deliverables, ensuring timely and efficient data solutions while maintaining high-quality standards.

Conducted comprehensive requirements analysis by engaging with business units to define KPIs and develop clear, actionable data requirements that supported decision-making.

Utilized Agile methodologies to iterate and refine business requirements throughout the project lifecycle, ensuring constant alignment with evolving needs.

Documented functional and technical specifications, including Business Requirements Documents (BRD) and Functional Requirements Documents (FRD), ensuring all business and IT stakeholders had a clear understanding of system capabilities.

Developed user stories and acceptance criteria for developers, enabling efficient and accurate system development while ensuring the technical requirements aligned with business objectives.

Created user test plans and facilitated user acceptance testing (UAT), gathering feedback from end-users to validate system functionality and ensure final product alignment with business needs.

Collaborated with cross-functional teams to define and implement business rules, data workflows, and reporting requirements, ensuring real-time data insights for case management.

Applied business process reengineering techniques to identify inefficiencies and recommend system improvements, optimizing workflows for better efficiency.

Assisted in the deployment and management of reporting tools and platforms, supporting business intelligence efforts through dashboards and automated reports.

Prepared solution options, conducted risk identification, and performed financial analyses such as cost/benefit assessments and ROI calculations to support decision-making.

Applied change management techniques to ensure smooth transitions during system upgrades, documenting change processes and guiding stakeholders through technology implementation.

Utilized MS Excel to analyze system data and track key project metrics, ensuring project milestones were met and providing detailed performance reports for stakeholders.

Ensured compliance with AML, KYC, ISO20022, and SWIFT financial regulations, maintaining adherence to regulatory requirements.

Automated payment reconciliation workflows using ETL tools like Alteryx and Informatica, improving operational efficiency.
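A minimal sketch of the reconciliation logic described above, using pandas rather than Alteryx or Informatica; the ledgers and transaction IDs here are hypothetical toy data, shown only to illustrate the match-and-exception pattern.

```python
import pandas as pd

# Toy ledgers: internal payment records vs. a bank statement (hypothetical data).
internal = pd.DataFrame({
    "txn_id": ["T1", "T2", "T3"],
    "amount": [100.00, 250.50, 75.25],
})
bank = pd.DataFrame({
    "txn_id": ["T1", "T2", "T4"],
    "amount": [100.00, 250.50, 60.00],
})

# A full outer merge flags records present on only one side.
recon = internal.merge(bank, on="txn_id", how="outer",
                       suffixes=("_internal", "_bank"), indicator=True)
matched = recon[recon["_merge"] == "both"]
exceptions = recon[recon["_merge"] != "both"]
```

The `indicator=True` column is what turns a plain join into a reconciliation: anything not marked `both` becomes an exception for manual review.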

Created detailed technical documentation outlining data processes, workflows, and system configurations to ensure maintainability and user training.

Developed data integration strategies using AWS Redshift, S3, and Glue, automating data pipelines for seamless data ingestion, transformation, and reporting.

Led the development of automated data solutions using AWS Lambda and Glue, streamlining data processing and integration for reporting and case management insights.

Facilitated stakeholder meetings and workshops to define business processes, system requirements, and identify potential inefficiencies, ensuring optimal system implementation.

Utilized Azure Data Factory (ADF) to automate data ingestion and transformation workflows, enhancing real-time reporting capabilities.

Designed Power BI and Tableau dashboards for real-time operational insights, tracking critical KPIs related to case management for better decision-making.

Worked closely with business users and technical teams to ensure data accuracy, governance, and quality, leveraging tools like Informatica and Trifacta for data cleansing and profiling.

Deployed machine learning models using AWS SageMaker to predict case outcomes and optimize service delivery, enhancing operational efficiency.

Collaborated with IT teams to integrate data solutions, ensuring seamless transfer of case management data across different systems via APIs and automated data pipelines.

TECHNICAL SKILLS:

Data Analysis & Business Intelligence

Tableau, Power BI, SAP Business Objects, SSRS, Looker, Excel, R, Python (Pandas, NumPy, SciPy, Statsmodels, Scikit-learn), Trend & Predictive Analytics

Databases & Data Management

SQL Server, MySQL, PostgreSQL, Oracle, AWS Redshift, Snowflake, Google BigQuery, MongoDB, Cassandra, Teradata, Data Modeling & Warehousing

ETL & Data Integration

SSIS, Informatica, Alteryx, Talend, Apache NiFi, Apache Airflow, Pentaho, AWS Glue, Azure Data Factory (ADF), Data Pipeline Automation

Cloud & Big Data Technologies

AWS (Redshift, S3, Glue, Lambda, Data Pipeline, SageMaker, Rekognition, EMR), Azure (ADF, Synapse, ML, Blob Storage), Google Cloud, Apache Spark, Hadoop, Hive, Kafka, Flink, Presto

Machine Learning & AI

Scikit-learn, TensorFlow, Keras, PyTorch, XGBoost, LightGBM, CatBoost, AWS SageMaker, Azure ML, MLflow, Feature Engineering & Model Deployment

Logistics & Supply Chain Management

Transportation Management Systems (TMS), Warehouse Management Systems (WMS), ERP Systems (SAP, Oracle), Real-Time Shipment Tracking, International Trade Compliance (Incoterms, HS Codes)

Data Governance & Quality

Informatica Data Quality, Collibra, Trifacta, OpenRefine, Data Profiling & Cleansing, Compliance (GDPR, HIPAA, SOX)

Programming & Workflow Automation

Python, R, SQL, VBA, Bash, Julia, Scala, Git (GitHub, GitLab, Bitbucket), Jenkins, Alteryx, Automation Scripting

Project Management & Documentation

Agile (Scrum, Kanban), Waterfall, BRD, FRD, Visio, Lucidchart, Confluence, Jira, SharePoint

PROFESSIONAL EXPERIENCE:

Client: Cigna Healthcare, Houston, TX Jan 2024 to Present

Role: Senior Business/Data Analyst

Roles & Responsibilities:

Analyzed complex datasets using SQL and Excel to uncover business trends, optimize healthcare policy decisions, and support data-driven strategies.

Developed and optimized data pipelines in Azure Data Factory (ADF), automating data ingestion, transformation, and orchestration for enterprise-wide analytics.

Optimized high-value payment processing using Fedwire and CHIPS networks for seamless transactions.

Designed and maintained data warehousing using Azure Synapse Analytics, ensuring high-performance data storage and retrieval for advanced analytics.

Created interactive dashboards in Power BI, visualizing key performance indicators (KPIs) to support business intelligence and operational efficiency.

Implemented predictive models using Azure ML, leveraging trend and forecasting analytics to enhance customer behavior insights and optimize service offerings.

Designed ETL workflows using SSIS, ensuring seamless data integration and improving data pipeline automation.

Integrated Confluence and SharePoint into project workflows, enhancing collaboration, version control, and documentation management across teams.

Performed detailed SWOT and GAP analyses, identifying business inefficiencies, risks, and opportunities, enabling strategic process improvements.

Developed and maintained SQL-based data models to support real-time business intelligence, enhancing data accessibility and query performance.

Ensured data governance and compliance using Informatica Data Quality, aligning data management with GDPR, HIPAA, and SOX standards.

Leveraged Azure Blob Storage for secure and scalable data storage, improving data accessibility and cost efficiency for analytical operations.

Applied data mining and statistical analysis techniques using R (ggplot2, dplyr) to identify business trends and operational insights.

Enhanced CRM data integration with Adobe Analytics, providing deeper insights into patient engagement and consumer behavior.

Conducted financial data analysis using Experian, supporting risk assessments, underwriting, and fraud detection initiatives.

Documented analytical processes using Git, ensuring audit compliance, version control, and process standardization for data handling activities.

Created detailed process flows using Visio, mapping data pipelines and business operations for streamlined workflows.

Led Azure-based advanced analytics projects, applying Azure Machine Learning and Synapse Analytics to drive predictive insights for healthcare optimizations.

Environment:

SQL, Excel, Azure (ADF, Synapse Analytics, ML, Blob Storage, Data Lake, Databricks, Purview), AWS (Redshift, Glue, Lambda, SageMaker, S3), Power BI, SSIS, Informatica (PowerCenter, Data Quality, MDM), Snowflake, Collibra, Confluence, SharePoint, Adobe Analytics, Experian, Git (GitHub, Bitbucket), Visio, Lucidchart, SAP BusinessObjects, Alteryx, Talend, Apache Spark, Hadoop (Hive, HDFS), JIRA, ServiceNow, Trifacta, Looker, SAS, BigQuery, Python (Pandas, NumPy, Scikit-learn, TensorFlow), R (ggplot2, dplyr, caret), Fedwire, CHIPS, SWIFT, ISO20022, AML, PCI DSS, KYC Compliance.

Client: BNY, New York May 2022 to Dec 2023

Role: Business/Data Analyst

Roles & Responsibilities:

Developed complex SQL queries and optimized stored procedures to improve data retrieval and financial reporting performance, ensuring high availability of banking data.
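As an illustration of the kind of aggregate reporting query involved, here is a self-contained sketch against an in-memory SQLite database; the `trades` schema and figures are invented for the example, not the bank's actual model.

```python
import sqlite3

# In-memory database standing in for a financial reporting store (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, desk TEXT, notional REAL)")
conn.executemany("INSERT INTO trades (desk, notional) VALUES (?, ?)",
                 [("fx", 1_000_000.0), ("fx", 250_000.0), ("rates", 500_000.0)])

# Aggregate notional per desk, filtering small books in a HAVING clause.
rows = conn.execute(
    """SELECT desk, SUM(notional) AS total
       FROM trades
       GROUP BY desk
       HAVING total > 400000
       ORDER BY total DESC"""
).fetchall()
```

In production this shape of query would typically live in a stored procedure so the optimizer can reuse its plan across report runs.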

Implemented data integration workflows using Talend, streamlining financial data ingestion, validation, and transformation processes across multiple banking platforms.

Designed and maintained financial data models in PostgreSQL, improving data structuring, indexing, and transactional efficiency for large-scale banking operations.

Developed dynamic reports using Looker, enabling interactive financial analysis, risk monitoring, and real-time investment tracking for senior executives.

Automated reconciliation processes using Alteryx, reducing manual efforts in transaction validation, fraud detection, and regulatory reporting.

Conducted advanced statistical modeling in Stata and MATLAB, applying Monte Carlo simulations and time-series forecasting for investment risk assessments.

Implemented streaming analytics using Apache Kafka, enabling real-time transaction processing and fraud monitoring across global banking networks.

Integrated customer sentiment analysis using NLP models in Python, analyzing unstructured text data from banking surveys and complaints to improve customer experience strategies.
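A toy version of such a text-classification pipeline, assuming a TF-IDF plus logistic-regression setup in scikit-learn; the complaint snippets and labels are fabricated for illustration and do not reflect the production model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled survey/complaint snippets (1 = negative sentiment).
texts = ["great service and fast support",
         "very happy with the branch staff",
         "terrible wait times and rude staff",
         "fees are unfair and hidden"]
labels = [0, 0, 1, 1]

# TF-IDF features feed a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)
pred = model.predict(["support was great"])[0]
```

A real deployment would add a proper train/validation split and likely a stronger language model; the pipeline object is the key idea, since it keeps vectorization and classification in one reusable unit.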

Developed financial stress-testing models in SAS, ensuring regulatory compliance with Basel III and stress testing for capital adequacy.

Enhanced loan portfolio risk assessment by utilizing Oracle Financial Services Analytical Applications (OFSAA), ensuring improved credit risk evaluations.

Leveraged Microsoft Purview for data governance, implementing automated classification and lineage tracking for financial compliance and audit readiness.

Developed IFRS 9 and Basel III regulatory reports using SAP Business Objects, ensuring timely submission of compliance reports to banking regulators.

Designed scalable financial reporting structures in IBM Cognos, facilitating data-driven decision-making for investment risk and capital management.

Executed real-time risk modeling using Azure Databricks, improving market risk predictions and stress testing for asset-liability management.

Implemented data encryption strategies using IBM Guardium, securing sensitive banking data and meeting regulatory security standards.

Documented analytical frameworks, compliance workflows, and financial risk methodologies in Confluence and SharePoint, ensuring auditability and policy adherence.

Environment: SQL, Azure (Databricks, Purview, ML, Blob Storage), Looker, PostgreSQL, Talend, Alteryx, Stata, MATLAB, SAS, Apache Kafka, SAP Business Objects, IBM Cognos, Microsoft Purview, Oracle Financial Services Analytical Applications (OFSAA), IBM Guardium, Confluence, SharePoint.

Client: State of North Dakota, Bismarck ND Feb 2021 to Apr 2022

Role: Business/Data Analyst

Roles & Responsibilities:

Analyzed large transportation datasets using SQL to identify traffic patterns, infrastructure needs, and road safety risks, optimizing state-wide transportation planning.

Developed and maintained AWS-based data pipelines using AWS Glue to ensure seamless integration of real-time transportation and traffic data from multiple sources.

Built interactive dashboards in AWS QuickSight to provide real-time visualization of traffic congestion, vehicle movements, and infrastructure utilization for transportation authorities.

Designed and optimized relational databases in AWS Redshift to enhance query performance for traffic analysis, public transit planning, and logistics tracking.

Implemented machine learning models using AWS SageMaker to predict accident hotspots and optimize route planning based on historical traffic data.

Performed ETL processes using AWS Lambda and AWS Glue, automating data extraction, transformation, and loading for large-scale transportation datasets.

Conducted geospatial analysis using Python (GeoPandas, Folium) to map road conditions, accident densities, and traffic congestion trends.

Integrated Salesforce CRM with AWS Redshift to improve data accessibility for customer service teams managing public transit inquiries and complaints.

Utilized Amazon S3 for centralized data storage, ensuring secure and scalable management of transportation data, vehicle tracking, and infrastructure reports.

Implemented predictive analytics using Python (Scikit-learn, Pandas) to forecast seasonal traffic fluctuations and infrastructure maintenance needs.
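A minimal sketch of trend forecasting with scikit-learn, using a synthetic linear series in place of the real sensor history; true seasonal effects would add sine/cosine month features to the same regression.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic monthly traffic volumes with a linear trend (illustrative data,
# standing in for historical counts from roadway sensors).
months = np.arange(1, 13).reshape(-1, 1)
volume = 1000 + 50 * months.ravel()    # e.g. vehicles/day growing 50 per month

model = LinearRegression().fit(months, volume)
forecast = model.predict([[13]])[0]    # next month's expected volume
```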

Performed data quality checks using AWS Data Wrangler to ensure clean and standardized datasets for accurate transportation reporting.

Created automated reports using Python and Boto3 to streamline real-time transportation monitoring and emergency response planning.

Developed Tableau-based executive reports summarizing road maintenance costs, traffic incident analysis, and budget allocation trends for state officials.

Leveraged AWS IAM roles and policies to enhance data security, access control, and compliance with transportation data regulations.

Managed documentation and data governance using Confluence and SharePoint to ensure data accuracy, regulatory compliance, and streamlined workflows.

Environment: SQL, AWS (Redshift, S3, Glue, QuickSight, Lambda, SageMaker, IAM, Data Wrangler), Python (Pandas, NumPy, Scikit-learn, GeoPandas, Folium), Salesforce, Tableau, Confluence, SharePoint

Client: Cyber Infrastructure (P) Ltd, Indore, India Jun 2018 to Nov 2020

Role: Junior Data Analyst

Roles & Responsibilities:

Managed SQL Server and Oracle databases, optimizing query performance and ensuring data integrity for high-volume business operations.

Developed interactive dashboards in Power BI and SAP Business Objects, providing real-time insights to stakeholders for data-driven decision-making.

Automated ETL workflows using Informatica, streamlining data extraction, transformation, and loading processes across multiple business units.

Utilized Apache Kafka for real-time data streaming, enabling immediate availability of transactional and operational data for analysis.

Implemented data security measures using IBM Guardium, ensuring compliance with regulatory standards and protecting sensitive enterprise data.

Conducted statistical data analysis using Python (Pandas, Matplotlib, Seaborn), supporting predictive analytics and business trend identification.

Managed project documentation and version control using Git and JIRA, ensuring collaboration, workflow tracking, and efficient data governance.

Optimized database performance using SQL Tuning Advisor, improving query execution speed and enhancing overall data processing efficiency.

Environment: SQL Server, Oracle, Power BI, SAP Business Objects, Informatica, Apache Kafka, IBM Guardium, Python, JIRA, Git, SQL Tuning Advisor.

Client: Sagar Informatics Pvt. Ltd, New Delhi, India Aug 2016 to May 2018

Role: SQL Developer

Roles & Responsibilities:

Developed complex SQL queries and stored procedures in Microsoft SQL Server, optimizing data retrieval, indexing, and query execution for high-performance applications.

Designed and maintained SSRS-based reporting solutions, delivering real-time business insights and automated reporting for decision-making.

Automated ETL workflows using SSIS, streamlining data integration, migration, and transformation across multiple data sources.

Conducted data validation and quality assurance using SQL Data Quality Services, ensuring high accuracy and consistency in business data.

Optimized database performance through query tuning, indexing strategies, and partitioning, reducing query execution time and improving system efficiency.
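The indexing point above can be demonstrated in miniature with SQLite's `EXPLAIN QUERY PLAN` (the `orders` table here is an invented example, and SQL Server's tooling differs, but the scan-versus-seek effect is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                 [(f"C{i % 100}", float(i)) for i in range(1000)])

# Without an index, filtering on customer scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'C7'").fetchall()

# An index on the filter column lets SQLite seek directly to matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'C7'").fetchall()
```

Reading the plan before and after a change is the basic loop of query tuning: confirm the optimizer actually switched from a full scan to an index search before trusting a timing improvement.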

Environment: Microsoft SQL Server, SSRS, SSIS, SQL Data Quality Services, Excel (VBA).

Education:

Bachelor of Technology (B.Tech) in Computer Science, K L University, Guntur, Andhra Pradesh, India – 2016


