
Chirag Nareshkumar Patadia

Cloudera Certified; Cloud Platform / Big Data / Hadoop / Liferay Professional

Phone: 732-***-****

Email: ac5d7u@r.postjobfree.com

Professional Summary

** years of experience in architecture, design, analysis, and implementation of enterprise applications in Core Java, J2EE, Google Cloud Platform, AWS, Microsoft Azure, Big Data/Hadoop technologies, Liferay, and Spark, with good expertise in Spring and Hibernate.

Good experience in application development and maintenance in Core Java & J2EE technologies.

Expert in installation, deployment, and maintenance of production environments on Linux/Solaris servers, and in post-implementation support to solve operational problems.

Good expertise in developing and administering applications using MongoDB as the database.

Good experience with the Hadoop framework and related technologies: Map-Reduce, HDFS, Solr, Pig, Flume, YARN, Sqoop, Storm, Oozie, Jaql, Hive, HBase, Oracle, MySQL.

Good experience in realization activities (such as data migration, server configuration and maintenance, testing, and User Acceptance Testing (UAT)) to ensure all deliverables are accepted.

Expert in the Agile Scrum delivery model, delivering superior-quality results.

Able to work under deadlines, contribute as an active team member, and accept responsibility.

Strong organizational skills, excellent communication skills, dedicated teamwork, attention to detail, and the ability to work under pressure to balance competing priorities and meet deadlines.

Cloud Platform Expertise:

Good experience in developing Big Data projects on cloud platforms (Google Cloud Platform, AWS, Microsoft Azure).

Experienced in installing, configuring, and administering clusters of virtual machines and major Hadoop distributions.

Experienced in handling large-scale databases such as BigTable, Datastore, BigQuery, and MongoDB.

Experienced in developing Apache Beam pipelines for data processing, running BigQuery ETLs, and loading petabytes of data into BigTable; a sketch of such a pipeline follows.
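
For illustration, a minimal Java sketch of this kind of Beam pipeline: read rows from one BigQuery table, tag them, and append them to another. The project, dataset, and field names are hypothetical, and this is a pattern sketch under those assumptions, not an actual production pipeline.

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class BigQueryEtlPipeline {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply("ReadStaging", BigQueryIO.readTableRows().from("my-project:staging.events"))
         .apply("TagRows", MapElements.into(TypeDescriptor.of(TableRow.class))
             .via((TableRow row) -> row.clone().set("processed", true))) // hypothetical field
         .apply("WriteWarehouse", BigQueryIO.writeTableRows()
             .to("my-project:warehouse.events")
             .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
             .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

        p.run().waitUntilFinish();
      }
    }

The same read/transform/write shape carries over to BigTable loads, with the sink swapped for BigtableIO.write() and a transform producing row mutations.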

BigData/Hadoop Expertise:

Cloudera Certified Hadoop Developer for Hadoop ecosystem technologies.

Good experience in developing projects using Big Data technologies like Oozie, Hadoop, Sqoop, HBase, InfiniDB, and MongoDB.

Experienced in installing, configuring, and administering Hadoop clusters from the major Hadoop distributions.

Experienced in handling large-scale databases such as MongoDB and InfiniDB.

Hands-on experience productionizing Hadoop applications (administration, configuration management, monitoring, debugging, and performance tuning).

In-depth understanding of Hadoop architecture and components such as JobTracker/TaskTracker, NameNode, and DataNode.

Expert in collecting and analyzing data from social networks such as Facebook, Google+, YouTube, and Twitter.

Experience in developing applications for big data analysis and sentiment analysis.

LIFERAY Expertise:

Liferay Certified Professional Developer – Liferay Portal 6.1 – certification issued by Liferay.

Contributed to Liferay and other open-source forums to supplement Liferay training with relevant solutions for customers.

CIGNEX Certified Liferay Developer (CCLP) (CIGNEX).

Experience with Liferay product suite - Liferay Social Office, Liferay Sync, Alloy UI, Liferay FACES, Liferay IDE & Mobile SDK.

Developed the integration module between Liferay and the IBM Cognos BI module for displaying Cognos reports in Liferay.

TECHNICAL SKILLS:

Content Mgmt. System: Liferay Enterprise Edition 6.x; 5.2.x; Liferay Standard Edition 5.2

Prog. Languages : Java, Python, UNIX Shell Script/Bash Scripting, C++, C, SQL, PL/SQL, SQL*PLUS, UML, XML

Frameworks : Spring, Spring Boot, JSF, Velocity, Hibernate, Struts, Liferay Portal Framework

Google Cloud Platform: BigTable, BigQuery, DataFlow, DataProc, Compute Engine, App Engine, Cloud SQL, DataStore, PubSub

Hadoop Ecosystem : HDFS, Map-Reduce, HIVE, HBase, Flume, ZooKeeper, Sqoop, PIG, Pentaho, Oozie, Kafka

Spring Framework : IOC, AOP, DAO, ORM, Spring Web MVC and Spring Batch

JEE & Standard Design Patterns: Adapter, Singleton, Abstract Factory, Chain of Responsibility, MVC, Facade, Business Delegate, Gang of Four including Factory Method, Session Facade, DAO and Business Object.

Databases : MongoDB, InfiniDB, Oracle 11g/10g/9i, MySQL 5.x/4.x, SQL Server 2008/2012, Alchemy

NoSQL Databases : HBase, MongoDB, Elasticsearch, Apache CouchDB

Web Services : RESTFul, SOAP, UDDI, WSDL, BPEL, WS-Policy, JAX-WS, Axis, JAX-RS, ESB, WS-Addressing

Web/App Servers : Tomcat 4.x/5.x/6.x/7.x, Apache, JBoss

SQL : DDL, DML, DQL, DCL for Table, Sequence, Trigger, Procedure, View, Index with Oracle Database

Build Tools : Maven, ANT, Jenkins, Hudson

Source Control System: GitHub, SVN (Subversion), CVS, Rational Clear Case, VSS, Perforce.

Defect/Bug Tracking : Rational Clear Quest, JIRA, and Bugzilla.

IDE & Reporting Tools: IntelliJ, Eclipse 3.x, MyEclipse, NetBeans 4.0, IDE Reports 0.5.0

Operating systems : Windows 10/8.x/7/Vista/XP, Win 2k Server, UNIX, Linux and Mac OSx

Other Skills : Requirements engineering, UML, Design patterns, code reviews, Test Driven Development

Standards & Trends : Agile Programming, Test Driven Development (TDD), Service Oriented Architecture (SOA).

CERTIFICATION:

Certificate – Year of Passing – Status

CCDH (Cloudera Certified Developer for Apache Hadoop) – 2012 – Certified

CCLP Beginners (CIGNEX Certified for Liferay Practice) – 2012 – Certified

CCMDBP (MongoDB Certification) – 2012 – Certified

Java Certified from Brainbench – 2013 – Certified

CCLP (Liferay Professional Program) – 2013 – Certified

MongoDB M101J (MongoDB for Java Developers) – 2014 – Certified

MongoDB M102 (MongoDB for DBAs) – 2014 – Certified

EDUCATION:

Bachelor of Science (B.S.) in Computer Science/Applications, Gujarat University

Master of Science (M.S.) in Computer Science/Applications, Gujarat University

Awards/Achievements:

Winner in “Program Debugging” Competition in Communiqué Event at DDIT

Won “Software Programming” Competition in SIRAAJ Event at SKP Institute of Management & Computer Studies.

College topper in both the Master's and Bachelor's programs in Computer Science and Applications.

Professional Experience

Valley National Bank – Wayne, NJ Jul 2017 Onwards

Project: Credit Analysis Dashboard

Valley National Bancorp (NYSE:VLY) is a regional bank holding company headquartered in Wayne, New Jersey. Valley National Bank is one of the largest commercial banks headquartered in New Jersey.

The Credit Risk Management group produces various reports such as Concentration Balance, Risk Rating, Loan-to-Value (LTV) Commercial Real Estate Index, Industry Classification, Property & Guarantee, and Exceptions. Reports are generated from multiple source systems, reference data, and third-party systems, and users view them in the Credit Analysis Dashboard.

Responsibilities:

Interact with business stakeholders to understand data needs and their mapping.

Responsible for preparing the architecture for the complete end-to-end solution.

Lead a Big Data team on Big Data solutions for multiple customers and verticals.

Responsible for designing the MongoDB database and collection structure (see the sketch after this list).

Responsible for building scalable distributed data solutions using Java, MongoDB, Trifacta, and PowerBI.

Integrated Salesforce Objects data ingestion with Cloudera Hadoop.

Used Angular 2.0/3.0/4.0, NodeJS, jQuery, and Underscore.js to build a responsive UI.

Developed reusable components using Custom Directives in AngularJS.

Managing, scheduling, and monitoring jobs on the Microsoft Azure cloud environment.

Developed validations using validation forms and reactive forms in Angular 2.

Resource management of the Microsoft Azure cluster, including adding/removing cluster nodes for maintenance and capacity needs.

Involved in data migration from traditional apps to Salesforce using the Data Loader utility.

Primary contact for the Data Ingestion Framework and for integrating the Trifacta data-wrangling tool and the PowerBI dashboard.
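
For illustration, a minimal sketch of the kind of collection design and indexing this involved, using the MongoDB Java driver. The database, collection, and field names are invented for the example; the real schema is not reproduced here.

    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.model.Indexes;
    import org.bson.Document;

    public class RiskRatingStore {
      public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
          MongoCollection<Document> ratings =
              client.getDatabase("creditAnalysis").getCollection("riskRatings");

          // Compound index so dashboard queries by borrower and report date stay fast.
          ratings.createIndex(Indexes.compoundIndex(
              Indexes.ascending("borrowerId"), Indexes.descending("reportDate")));

          ratings.insertOne(new Document("borrowerId", "B-1001")
              .append("reportDate", "2018-04-30")
              .append("riskRating", 4)
              .append("ltv", 0.72));
        }
      }
    }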

Environment: Microsoft Azure Cloud Platform, Cloudera Hadoop, Hive, HBase, Spark Framework, Java, Salesforce Objects Integration, Amazon Web Services (AWS), JQuery, Angular 5.0/4.0/3.0, Trifacta Data Wrangling tool, PowerBI, Python, HTML 5.0, JetBrains IntelliJ IDEA, Maven

The Home Depot – Atlanta, GA Oct 2015 to Jun 2017

Project: Supplychain Data Integration and Management

The Home Depot is an American retailer of home improvement and construction products and services. It operates many big-box format stores across the United States (including all 50 U.S. states, the District of Columbia, Puerto Rico, the United States Virgin Islands and Guam), all ten provinces of Canada, as well as Mexico.

The supply-chain simulation tool is a large-scale emulation of Sync and CAR ordering and RDC allocation. The simulator can be fed forecasted sales, inventory parameters, and initial inventory levels for order and inventory forecasting, and can be fed historical sales, forecasts, parameters, and inventory for scenario modeling.

Responsibilities:

Provided mentorship and guidance in operationalizing the Hadoop ecosystem, especially Hive, to solve business problems, as well as Google Cloud Platform and its products.

Interacted with business stakeholders to understand data needs and their mapping.

Responsible for preparing the architecture for the supply-chain simulation tool.

Led a Big Data team on different Big Data solutions/PoCs for multiple customers and verticals.

Responsible for building scalable distributed data solutions using Spark and Python (see the sketch after this list).

Managed, scheduled, and monitored jobs on the Hadoop and Google Cloud Platform clusters.

Used Angular 4.0/3.0/2.0, HTML5, CSS3, Bootstrap in designing the UI application.

Designed dynamic and browser compatible pages using HTML5, CSS3, JavaScript and AngularJS.

Created forms to collect and validate data from the user in HTML5 and AngularJS.

Implemented Salesforce Development Cycle covering Sales Cloud and Service Cloud.

Provide consulting to customers in identifying Big Data use cases and then guiding them towards implementation of use cases.

Resource management of the Hadoop and Google Cloud Platform clusters, including adding/removing cluster nodes for maintenance and capacity needs.

Primary contact person for Oozie job configuration.
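
For illustration, a minimal Spark sketch of the kind of distributed aggregation a simulation tool like this relies on. It is written in Java for consistency with the other sketches here (PySpark was also used on the project), and the bucket paths and column names are hypothetical.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import static org.apache.spark.sql.functions.col;
    import static org.apache.spark.sql.functions.sum;

    public class SimOrderRollup {
      public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("SimOrderRollup").getOrCreate();

        // Roll forecasted sales up to SKU x RDC so the simulator can allocate inventory.
        Dataset<Row> sales = spark.read().parquet("gs://sim-input/forecasted_sales/");
        sales.groupBy(col("sku"), col("rdc"))
             .agg(sum(col("forecast_units")).alias("total_units"))
             .write().mode("overwrite").parquet("gs://sim-output/rdc_allocation/");

        spark.stop();
      }
    }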

Environment: Google Cloud Platform App Engine, Google Cloud Dataflow/Beam Pipeline, BigQuery, Google Cloud Storage, Cloud SQL, Angular 4.0/3.0/2.0, Compute Engine, Amazon web services (AWS), Hortonworks Hadoop, HDFS, Hive, Oozie, Sqoop, Spark Framework, Spring Boot, Java, PySpark, Python, JSP, HTML 5.0, MongoDB, Eclipse, Maven

Salesify Inc – Redwood City, CA Jan 2015 to Sep 2015

Project: Techleads Online 2.0 Next-Gen B2B Lead Generation Platform

Salesify is a global B2B lead generation company focused on improving the ROI and efficiency of sales and marketing organizations. The Salesify service helps turn anonymous web traffic into real leads and then helps convert leads to revenue via appointment setting and telesales. Services include building customized B2B contact lists, account profiling, CRM data cleansing and appends, event and whitepaper promotions, lead qualification and appointment setting, and renewal and maintenance sales. The Salesify service delivers high-quality, cost-effective services and programs customized to address each client's unique requirements.

Salesify Business Intelligence module ensures customers are focusing their sales and marketing efforts on the right person at the right account at the right time.

Salesify Lead Generation module enables you to generate buzz and syndicate your content to a highly targeted audience.

Salesify Telemarketing Service module generates highly qualified leads and actionable market intelligence.

The Salesify Data Quality Service module helps companies gain control over their existing B2B customer data with CRM data cleansing and enrichment.

Responsibilities:

Provide mentorship and guidance in operationalizing Hadoop and its ecosystem to solve business problems.

Interacted with stakeholders in various Big Data initiatives, understood data needs, and defined Data Services product requirements.

Analyzed the Hadoop cluster and different big data analytic tools, including Pig, the HBase database, and Sqoop.

Led the Big Data CoE team on different Big Data solutions/PoCs for multiple customers and verticals.

Responsible for building scalable distributed data solutions using Hadoop.

Installed and configured Flume, Pig, Sqoop, HBase on the Hadoop cluster.

Managing, scheduling and monitoring Jobs on a Hadoop cluster.

Worked on cluster installation, commissioning and decommissioning of DataNodes, NameNode recovery, capacity planning, and slot configuration.

Provide consulting to customers in identifying Big Data use cases and then guiding them towards implementation of use cases.

Coordinated with technical team for installation of Hadoop & production deployment of software applications for maintenance.

Developed RESTful web services using Spring ORM, Hibernate, and Jersey to provide services for the AngularJS application (a sketch follows this list).

Provided technical assistance for configuration, administration and monitoring of Hadoop clusters.

Resource management of the Hadoop cluster, including adding/removing cluster nodes for maintenance and capacity needs.

Work with support team on Hadoop performance tuning, configuration, optimization and job processing.

Formulated procedures for planning and execution of system upgrades for all existing Hadoop clusters.
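
For illustration, a minimal JAX-RS (Jersey) resource of the sort that backed the AngularJS front end. The resource and field names are hypothetical, and the inline stub stands in for what would really be a Spring-managed Hibernate DAO.

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    @Path("/leads")
    public class LeadResource {

        // Tiny DTO serialized to JSON for the AngularJS client.
        public static class Lead {
            public long id;
            public String company;
            public Lead(long id, String company) { this.id = id; this.company = company; }
        }

        @GET
        @Path("/{id}")
        @Produces(MediaType.APPLICATION_JSON)
        public Lead getLead(@PathParam("id") long id) {
            // A real implementation would delegate to a Spring-managed Hibernate DAO.
            return new Lead(id, "Example Corp");
        }
    }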

Environment: Java, MongoDB, Map-Reduce, ZooKeeper, Tomcat 7.0, Amazon web services (AWS), Spring, Alteryx ETL, JasperSoft Reports, JSP, HTML 5.0, AngularJS, Eclipse, SVN, Continuous Integration, Jenkins, Maven, Sonar

AON Hewitt – Lincolnshire, IL Sep 2014 to Dec 2014

Project: Liferay Portal

AON Hewitt is among the world's top global human capital and management consulting firms, providing a complete array of consulting, outsourcing, and insurance brokerage services. Their professionals possess extensive knowledge and experience in a variety of fields and help companies of all sizes attract and retain top talent. The project involves a complete HR portal solution covering content management, tile management, and user management.

Responsibilities:

Developed web services for both integrations.

Understood client requirements and prepared user story documents.

Liferay portal administration and development; site setup and JCR setup in Liferay.

Worked on SOA architecture and implemented web services; used JDBC for the data extraction application.

Used Agile method for software development.

Followed the JSR-286 specification for portlet development in Liferay (see the sketch after this list).

Maintained different pipelines for development and maintenance.

Created logging, auditing, and exception frameworks for the DPS application using Spring.

Complete agile life cycle delivery management.

Portlet, theme, and layout development as per requirements.

SuccessFactors API integration using SOAP.

SAML integration with Liferay for SSO.

Developed the module for fetching requisition data from Taleo and displaying it in Liferay.

Wrote shell scripts; performed stress testing using the Rational Performance Tester (RPT) tool.

Profiling/configuration on Liferay Portal server

Developed AMCharts for displaying IBM Cognos BI reports in Liferay.
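
For illustration, a minimal JSR-286 portlet skeleton of the kind this work involved; a real Liferay portlet would typically dispatch to a JSP rather than write markup directly.

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.portlet.GenericPortlet;
    import javax.portlet.PortletException;
    import javax.portlet.RenderRequest;
    import javax.portlet.RenderResponse;

    public class HelloPortlet extends GenericPortlet {

        @Override
        protected void doView(RenderRequest request, RenderResponse response)
                throws PortletException, IOException {
            // VIEW-mode render phase; the portal invokes this when the portlet is displayed.
            response.setContentType("text/html");
            PrintWriter out = response.getWriter();
            out.println("<p>Hello from a JSR-286 portlet.</p>");
        }
    }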

Environment: Liferay Portal 6.1.10 EE SP2, SuccessFactors API, Plugin Development Environment, Velocity, AMCharts, JSP, HTML, Eclipse, SVN, Java, HTML 5.0, Maven, Apache Server, Sonar, Hudson, MySQL, Liferay Tomcat 7.0, JavaScript, JQuery, SAML, CSS, Windows and Linux.

American BigData (AM BigData) INC (www.ambigdata.com) – Atlanta, GA Feb 2014 to Sep 2014

Project: BIGDATA Liferay Integrated Solution

The product collects, processes, and analyzes data from social media sites. It monitors all possible social sites (such as Facebook, Twitter, Google+, YouTube, and Instagram) for different users and keywords. Registered users can see sentiment analysis and trending for different keywords, populated from the data collected across the different social media sources.

Responsibilities:

Developed modules for fetching data from different social media platforms such as Facebook, Twitter, Google+, and YouTube.

Used the DataSift API for getting historical data from the social media platforms.

Used AlchemyAPI for sentiment analysis across the different social media platforms.

Involved in defining the schema for the MongoDB collection for storing the social data.

Understood client requirements and prepared user story documents.

Mule scheduler and web service code development.

Created different workflows in Mule for the social data collection and processing.

Developed Map-Reduce functionality for IT log analysis (see the sketch after this list).

Involved in InfiniDB database design.

Sqoop integration with Hadoop.

Developed Oozie workflows to integrate with the Hadoop system.
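
For illustration, a minimal Hadoop MapReduce job of the log-analysis kind mentioned above: it counts log lines per severity level. The assumed log layout (timestamp, level, message) is hypothetical.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class LogLevelCount {

      public static class LevelMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text level = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
            throws IOException, InterruptedException {
          String[] fields = value.toString().split("\\s+");
          if (fields.length > 2) {          // assumed layout: timestamp level message...
            level.set(fields[1]);
            ctx.write(level, ONE);
          }
        }
      }

      public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
            throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable v : values) sum += v.get();
          ctx.write(key, new IntWritable(sum));
        }
      }

      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "log level count");
        job.setJarByClass(LogLevelCount.class);
        job.setMapperClass(LevelMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }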

Environment: Java, Hadoop, Oozie, Sqoop, MongoDB, Keyword Monitoring from Social API (for Facebook, Twitter, GooglePlus, Youtube, Instagram), JSP, HTML 5.0, HDFS, Map-reduce, ZooKeeper, Eclipse, SVN, Sonar, Hudson

GUAVUS INC – San Mateo, CA Aug 2013 to Feb 2014

Project: BIGDATA Telecom Data Solution/Integration

Telecom operators are seeking more visibility to better run their policy systems and to understand policy impact on subscribers, services, and the network. The PCRF product enables service providers to manage and monetize mobile data and evolve to LTE. By combining Big Data technologies with the client's policy solution, the resulting insight can provide visibility into policy usage, tier changes, and system performance. This wealth of information can benefit operations, engineering, planning, and marketing teams. Providing such visibility in a meaningful manner can generate up-sell opportunities with existing customers.

Responsibilities:

Reviewed and documented data requirements, workflows and logical procedures.

Maintained application performance levels through the coding and testing phases.

Designed and developed applications utilizing Java and J2EE technologies.

Understood client requirements and prepared user story documents.

Produced designs from the user story/requirement documents for Hadoop MapReduce (MRv1) and Oozie workflows.

Map-Reduce job performance tuning and development.

Developed a single-click installation ISO for the Big Data Cloudera Hadoop cluster.

Installation and configuration of the overall system through to the production environment.

Set up and configured the Hadoop cluster along with other frameworks like Flume, Sqoop, and MongoDB.

Supported legacy application integration and conducted peer reviews.

Environment: Java, Hadoop HDFS, MapReduce, Java Servlet, MySQL, InfiniDB, Oozie, Sqoop, ANT, JSP, HTML 5.0, Flume, ZooKeeper, Eclipse, SVN, Sonar, Hudson

Saudi Telecom Company (STC) – Riyadh, Saudi Arabia Feb 2013 to Aug 2013

Project: STC Liferay Portal

Developed the Liferay intranet portal used in-house by STC. It is a single point of access for employee activities like leave applications, votes, and feeds, and integrates other third-party applications. The portal is based on Liferay; the work involved migrating the existing J2EE portal to Liferay (which is open source) and using its CMS, which is easily manageable with minimal configuration.

Responsibilities:

Worked onsite in Riyadh, Saudi Arabia.

Installation and configuration of the overall system through to the production environment.

Developed email newsletter functionality for the Weekly Dose as well as the Sales and Marketing newsletters.

Developed a hiring portlet for new-hire employees, integrated with SAP.

Understood client requirements and prepared user story documents.

Utilized Liferay Enterprise Portal for development of apps.

Liferay theme designing.

Developed program and grant commission portlet for funding approval functionality.

Developed Custom and Scheduled report tool for workers.

Developed various schedulers for the monitoring functions in Liferay.

Configured the load balancer for Liferay.

Customization of different out-of-the-box portlets in Liferay, like Asset Publisher, Documents and Media display, etc.

Integrated LDAP with Liferay.

Involved in application deployment and performance tuning.

Environment: Java, Liferay 6.1, ANT, JSP, HTML 5.0, Eclipse, SVN, Maven, Apache Server, Sonar, Hudson

InterContinental Hotels (IHG) Group – Buckinghamshire, UK Jul 2012 to Feb 2013

Project: IHG Liferay Portal

The project involves implementing security and permissions in the portal using a custom permission algorithm layered over Liferay's permission algorithm, along with a WFM (Work Flow Management System) for the activity module.

Responsibilities:

Completely involved in requirement gathering and overall project understanding.

Understood client requirements for the permission algorithm as per client policies.

Led and managed the development team for the security and permissions module, and provided the best migration solution within the timeframe.

Set up the environment using the Plugin SDK and Eclipse IDE for Liferay 6.0.5.

Installation of Tomcat + Liferay 6.0.5.

Plugin Ext Environment, Hook Environment, Plugin Environment Creation in Liferay 6.0.5.

Provided support through go-live of the project, without bugs or rework.

Inter-portlet communication using events.

Theme Creation/modifications

Customization of different out-of-the-box portlets in Liferay, like Asset Publisher, Documents and Media display, etc.

Environment: Liferay Portal 6.1.20 EE, Plugin Development Environment, Velocity, MySQL, Liferay Tomcat 7.0, JavaScript, JQuery, AJAX, ANT, HTML, CSS, JSTL, Eclipse, Tortoise SVN, Windows and Linux.

BASF Chemical – Germany Nov 2011 to Jun 2012

Project: Liferay and BigData Hadoop Integration

This project integrates Liferay's Document Library with HBase/HDFS and with MongoDB: documents uploaded through Liferay's Document Library portlet are stored in HBase and HDFS, or in MongoDB, instead of on local disk. A sketch of the HBase write path follows.
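
For illustration, a minimal sketch of such an HBase write path using the HBase Java client (the modern Connection/Table API is shown here; the original work predates it). The table, column families, and row-key layout are invented for the example.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class DocumentStore {
      public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("dl_documents"))) {

          // One row per uploaded document: metadata and raw bytes in separate families.
          Put put = new Put(Bytes.toBytes("doc-42"));
          put.addColumn(Bytes.toBytes("meta"), Bytes.toBytes("title"), Bytes.toBytes("report.pdf"));
          put.addColumn(Bytes.toBytes("content"), Bytes.toBytes("bytes"), new byte[0]); // file bytes here
          table.put(put);
        }
      }
    }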

Responsibilities:

Liferay 6.0 HDFS extension development and maintenance.

Liferay clustering for Unisys

Checked all possible queries to search documents with the Apache Solr indexing tool.

Node.js (node-HBase) R&D to upload files directly into HBase.

MongoDB setup and administration; MongoDB database schema design.

MongoDB query optimization to fetch stored documents

Query writing, development, debugging, setup, and work on defined tasks for Apache Solr and Liferay.

Handled development for a Solr query execution issue found during various types of query analysis on Solr.

Created a Jersey web service to call the HBase/Solr API.

Created a generic schema API for HBase and Solr.

Environment: Liferay Portal 6.0 Ext Env, Liferay Portal 6.0 Plugin Dev Env, Hadoop Technologies, HBase, HDFS, MongoDB, Apache Solr, Velocity, MySQL, Liferay Tomcat 6.0, ANT, SVN, Maven, Hudson

Elitecore Technologies – Jakarta, Indonesia Sep 2007 to Nov 2011

Position: Senior Software Engineer

Project: OSS/BSS Telecom Solution, For – IndosatM2, Jakarta (Indonesia)

This is a data service billing solution for prepaid, postpaid, and hybrid business models. The main challenge in this project is the rating and charging engine: a subscriber may access service via dial-up, a CDMA device, a Wi-Fi zone, or the 3.5G network, and network access from each device is charged against a different rate card measured in different units.

Developed a prepaid data service billing solution, which includes product management, subscriber management, service provisioning, recharge, and on-demand service activation, e.g. BoD (Bandwidth on Demand), VoIP, IPTV, etc. A sketch of the rate-card idea follows.
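
For illustration, a toy sketch of the rate-card idea: each access network is priced per its own unit, and a usage record is charged against the matching card. The networks, rates, and units below are invented for the example.

    import java.math.BigDecimal;
    import java.util.EnumMap;
    import java.util.Map;

    public class RatingEngine {

        enum Network { DIAL_UP, CDMA, WIFI, MOBILE_3G }

        // Hypothetical rate cards: price per unit, where the unit differs per network
        // (e.g. per minute for dial-up, per megabyte for 3.5G).
        private final Map<Network, BigDecimal> ratePerUnit = new EnumMap<>(Network.class);

        public RatingEngine() {
            ratePerUnit.put(Network.DIAL_UP, new BigDecimal("0.02"));   // per minute
            ratePerUnit.put(Network.CDMA, new BigDecimal("0.05"));      // per MB
            ratePerUnit.put(Network.WIFI, new BigDecimal("0.01"));      // per MB
            ratePerUnit.put(Network.MOBILE_3G, new BigDecimal("0.08")); // per MB
        }

        public BigDecimal charge(Network network, long units) {
            return ratePerUnit.get(network).multiply(BigDecimal.valueOf(units));
        }

        public static void main(String[] args) {
            RatingEngine engine = new RatingEngine();
            System.out.println(engine.charge(Network.MOBILE_3G, 250)); // prints 20.00
        }
    }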

Responsibilities:

Worked onsite in Jakarta, Indonesia.

Primary point of contact for client interactions, discussions, requirements, and queries.

Helped the Project Manager with project planning activities; guided and supported the technical team in installation and delivery of the project at the client's location.

Worked at the client's site for go-live, User Acceptance Testing (UAT), migration, parallel run, site maintenance, and client support activities.

Carried complete ownership of the entire prepaid solution.

Design and development of some of the project's important functionalities, like subscriber creation, quota management for subscribers, the billing process, access rights management, etc.

Understood the client's existing system database and performed mapping between the existing and new systems. Developed all migration scripts and placed them in the Oracle scheduler for automatic execution.

UAT for the server switch-over change request; standby servers were made live in this UAT.

Performed support actions, such as resolving issues from the live site (client side) and issues raised by Implementation Team members, Support Team members, and QA staff.

Found the root cause (RCA) of each issue and then provided a proper permanent solution. The main responsibility was keeping all functional flows stable and improving product quality.

Implemented all external system integrations.

Carried complete ownership of version builds and installation for client requirement solutions.

Environment: Java, ANT, JSP, EJB, JMS, Oracle 10g, JBoss 3.2.6, Eclipse, SVN, Maven, Apache Server, Sonar, Hudson, SOAP

Web@Ease Solutions Pvt. Ltd – Gujarat, India May 2004 to Sep 2007

Position: Software Developer

Project: E-Commerce Portal

E-commerce portal for ladies' beauty items.

Responsibilities:

Worked on HLD (high-level design) and LLD (low-level, detailed design) documents.

Designed and developed a new portal used for integrating data from multiple diverse companies.

Worked on SOA architecture and implemented web services; used JDBC for the data extraction application.

Developed the presentation layer of the application using JSP/JSF and JavaScript to display information.

Rewrote an existing portal application to follow the MVC design pattern.

Developed the JMS message-processing module to send and receive JMS messages (see the sketch after this list).

Used Agile method for software development.

Maintained different pipelines for development and maintenance.

Understood the client's existing system database and performed mapping between the existing and new systems. Developed all migration scripts and placed them in the Oracle scheduler for automatic execution.
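
For illustration, a minimal JMS 1.1 send/receive round trip of the kind the message-processing module performed. ActiveMQ is used here only as a stand-in broker, and the queue name and payload are invented.

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.Destination;
    import javax.jms.MessageConsumer;
    import javax.jms.MessageProducer;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import org.apache.activemq.ActiveMQConnectionFactory;

    public class OrderMessaging {
      public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection connection = factory.createConnection();
        try {
          connection.start();
          Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
          Destination queue = session.createQueue("orders");

          // Send a message.
          MessageProducer producer = session.createProducer(queue);
          producer.send(session.createTextMessage("{\"orderId\": 42}"));

          // Receive it back (5-second timeout).
          MessageConsumer consumer = session.createConsumer(queue);
          TextMessage received = (TextMessage) consumer.receive(5000);
          System.out.println("Received: " + received.getText());
        } finally {
          connection.close();
        }
      }
    }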

Environment: Apache Tomcat 6.0, MySQL, JavaScript, JQuery, AJAX, ANT, JSP, HTML, CSS, JSTL, WebServices, Servlets, JDBC.


