
Computer Science Java/Hadoop Developer

Location:
Denton, TX
Salary:
80000
Posted:
March 31, 2017


Resume:

NEHA K

aczlc6@r.postjobfree.com

832-***-****

Hadoop/JAVA Developer

Ms. Neha has almost 2 years of IT experience. She is experienced in all phases of the Software Development Life Cycle (SDLC), quality management systems, and project life cycle processes. She has proven expertise in a wide range of languages and technologies, including Java, C, C++, Hadoop MapReduce, Pig, Hive, HBase, Sqoop, Flume, YARN, CSS, and JavaScript. Ms. Neha possesses an excellent work ethic, is self-motivated, is a quick learner, and functions exceptionally well in a team-oriented environment.

Relevant Certifications, Course Work

» Edureka Certified Big Data & Hadoop Developer

» IBM DB2 Certified Academic Associate

» Related Coursework: Probability and Statistics

» C, C++, JAVA

» Data Warehousing and Data Mining

» Data Structures

» Advanced Data Structures

» Computer Networks

» Design and Analysis of Algorithms

» Pattern Recognition

» Machine Learning

» Database Management Systems

» Information Retrieval Systems

» Programming Languages: Java (J2SE, J2EE), C, C++, C#

» Scripting Languages: HTML, XML, JavaScript, CSS, jQuery, PHP

» Big Data Ecosystems: Hadoop MapReduce, Pig, Hive, HBase, Sqoop, Oozie, Flume, ZooKeeper.

» Databases: MySQL, Oracle.

» Platforms: Windows, Ubuntu, Linux

» Framework: .Net.

» Packages: MS Word, MS PowerPoint, MS Excel, MS Project.

Education

» Master's in Computer Science (GPA: 3.6/4.0), Expected May 2017, University of North Texas – Denton, TX

» Bachelor's in Computer Science (GPA: 3.7/4.0), August 2011 – May 2015, VNR Vignana Jyothi Institute of Engineering and Technology – India

Detailed Experience

Madhavi Engineering Jun’14 – May’15

Hadoop Developer Hyderabad, IN

Environment: Hadoop MapReduce, Pig, Hive, HBase, Sqoop, Oozie, Flume.

Responsibilities

» Developed MapReduce programs to parse the raw data, populate tables and store the refined data in tables.

» Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with reference tables and historical metrics.

» Scheduled and managed Hadoop jobs using the Apache Oozie workflow scheduler.

» Managed and reviewed Hadoop log files.

» Worked with HBase, the HBase architecture, and its components.

» Worked with different data loading techniques (Flume, Sqoop, live Twitter data) for ingesting data onto HDFS.

» Worked with Pig and MapReduce, and wrote Pig Latin scripts.
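The MapReduce work described in the responsibilities above can be illustrated with a minimal in-memory sketch of the word-count pattern (plain Java with no Hadoop dependency; the class and method names are illustrative, not taken from the original project):

```java
import java.util.*;
import java.util.stream.*;

// Minimal in-memory sketch of the MapReduce word-count pattern:
// a map phase emits words, a shuffle groups identical keys,
// and a reduce phase sums the counts per word.
public class WordCountSketch {

    // Map phase: split one input line into lowercase words.
    static List<String> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\W+"))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.toList());
    }

    // Shuffle + reduce phase: group identical words and count them.
    static Map<String, Long> reduce(List<String> words) {
        return words.stream()
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<String> lines = Arrays.asList("Hadoop is fast", "Pig runs on Hadoop");
        List<String> mapped = lines.stream()
                .flatMap(l -> map(l).stream())
                .collect(Collectors.toList());
        Map<String, Long> counts = reduce(mapped);
        System.out.println(counts.get("hadoop")); // 2
    }
}
```

In a real Hadoop job the same two phases are written as a `Mapper` and a `Reducer` class, and the framework handles the shuffle and the distribution across nodes.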

Madhavi Engineering Jun’13 – May’14

Software Developer Hyderabad, IN

Environment: JSP, JavaScript, CSS, Eclipse IDE plug-ins

Responsibilities

» Developed the website modules using Java, JSP, Servlets and JavaScript.

» Designed the database schema and wrote complex SQL queries.

» Normalized tables and maintained referential integrity using triggers, primary keys, and foreign keys.

» Developed PL/SQL stored procedures and functions to manipulate the database.

» Gained experience with core Java SE, including the Collections API, generics, multithreading, exception handling, and JDBC.

» Used JSPs to create dynamic pages for user interaction.

Academic Projects

Airline Analysis (PIG and LINUX)

This project deals with airline datasets covering airports from all over the world. It uses Pig Latin to retrieve data from the Hadoop Distributed File System (HDFS) and perform calculations on the retrieved data on the client side.

LTE S1/X2 Handover Stack Implementation

Developed an application in Java with the NS-3 simulator to strengthen mobile signals and minimize delay time. This application uses the LTE architecture to reduce response time.

Recommender Systems

In this project we designed a recommender system in Java using K-Nearest Neighbor and collaborative filtering. We calculate the distances between the active user and all other users and, based on the nearest neighbors, generate recommendations for the user.
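A minimal sketch of the distance computation at the core of such a K-Nearest-Neighbor recommender (plain Java; the rating vectors and the value of k below are illustrative assumptions, not data from the project):

```java
import java.util.*;

// Sketch of the K-Nearest-Neighbor step of a collaborative-filtering
// recommender: compute Euclidean distances between the active user's
// rating vector and every other user's, then keep the k closest users.
public class KnnSketch {

    static double distance(double[] a, double[] b) {
        double sum = 0.0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            sum += d * d;
        }
        return Math.sqrt(sum);
    }

    // Return the indices of the k users nearest to the active user.
    static int[] nearestNeighbors(double[] active, double[][] others, int k) {
        Integer[] idx = new Integer[others.length];
        for (int i = 0; i < idx.length; i++) idx[i] = i;
        Arrays.sort(idx, Comparator.comparingDouble(i -> distance(active, others[i])));
        int[] result = new int[k];
        for (int i = 0; i < k; i++) result[i] = idx[i];
        return result;
    }

    public static void main(String[] args) {
        double[] active = {5, 3, 0, 1};            // hypothetical item ratings
        double[][] others = {
            {4, 3, 0, 1},   // very similar user
            {1, 1, 5, 5},   // dissimilar user
            {5, 3, 1, 1},   // similar user
        };
        // Items liked by these nearest users become recommendation candidates.
        System.out.println(Arrays.toString(nearestNeighbors(active, others, 2)));
    }
}
```

Once the k nearest users are found, items they rated highly but the active user has not seen become the recommendation candidates.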

UNT Search Engine using Vector Space Model

This project deals with Vector Space Model retrieval of pages belonging to the UNT domain. The pages are downloaded with the JSOUP Java library and converted to Document objects. Results are then retrieved by ranking documents using the cosine similarity between each document and the query.
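The cosine-similarity scoring described above can be sketched as follows (plain Java; the vocabulary and term-frequency vectors are hypothetical examples, not UNT data):

```java
// Sketch of vector-space-model scoring: documents and the query are
// represented as term-frequency vectors over a shared vocabulary, and
// documents are ranked by cosine similarity to the query vector.
public class CosineSketch {

    static double cosine(double[] a, double[] b) {
        double dot = 0.0, normA = 0.0, normB = 0.0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        if (normA == 0 || normB == 0) return 0.0;   // avoid division by zero
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    public static void main(String[] args) {
        // Hypothetical vocabulary: ["unt", "admissions", "library"]
        double[] query = {1, 1, 0};
        double[] doc1  = {2, 3, 0};   // page about admissions
        double[] doc2  = {1, 0, 4};   // page about the library
        // doc1 scores higher, so it would be ranked first for this query.
        System.out.printf("doc1: %.3f, doc2: %.3f%n",
                cosine(query, doc1), cosine(query, doc2));
    }
}
```

In the full system the vectors would typically be TF-IDF weighted rather than raw term counts, but the ranking step is the same.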


