Job Openings at Openwave
We understand your needs; we recognize your motivations; we challenge your intellect with stimulating projects; we match the best in compensation; we reward your efforts. Come find out what we can offer each other, and take up a position that could make you a leader tomorrow.
Job Title: Sr. Computer Systems Analyst
Requirement Number: OWC-SCSA-DK1
- Analyze science, engineering, business, and other data-processing problems (big data) to implement solutions and improve system performance.
- Analyze and design highly available web solutions using Apache Web Server and replication mechanisms. Use continuous-integration methodology to set up, deploy, and build environments.
- Analyze or recommend commercially available software by evaluating features based on usability and performance metrics.
- Create test scenarios in the User Acceptance Testing (UAT) and Systems Interface Testing (SIT) environments, and perform endurance testing.
- Coordinate with developers, QA, product owners, and others according to specific business requirements.
- Utilize SQL databases (Oracle, MySQL, MS SQL Server), NoSQL databases (MongoDB, Cassandra), and Hadoop systems to design and develop software and to handle big-data processing and storage requirements. Use Hive, Pig, and MapReduce tools to analyze collected data.
- Use programming languages and frameworks including Core Java, J2EE, multithreading APIs, Spring, Hibernate, AWS APIs, REST/SOAP services, Spring Boot, Oracle PL/SQL, NoSQL (MongoDB), and MySQL to design and develop solutions.
EDUCATION AND EXPERIENCE REQUIREMENT:
Requires a Bachelor’s degree in Computer Science, Computer Information Systems, Engineering, or a related field, plus 1 year of experience in the job offered or 1 year of experience in the related occupation. Will accept any combination of degrees, certificates, training, and/or experience evaluated as equivalent to a US Bachelor’s degree by a credential evaluator.
Experience may be gained as a Sr. Technology Leader or under any other job title performing the following:
- Defining infrastructure for the design and integration of internet computing systems by analyzing information requirements; determining platform architecture, technology, and tools; and studying business operations and user-interface requirements. Responsible for big-data proofs of concept (POCs) using Hadoop, Hive, Pig, MapReduce, and MongoDB. Storing terabytes of data in a Hadoop cluster.
- Using profiling tools to improve system performance, incorporating the most complex technical development concepts, the latest software tools and technologies, and strong database concepts and design techniques.
- Improving architecture by tracking emerging technologies and evaluating their applicability to business goals and operational requirements.
- Participating in all quality- and SDLC-related processes (RTM, SFS, SRS, UTP, UTR, code review).
- Acting as a technology mentor for development and quality management teams to ensure proper product design and development.
- Setting up a Hadoop cluster, consisting of 1 name node and 10 data nodes, to improve the performance of the reporting module and to support analytical reporting. Preparing Sqoop scripts to transfer data into Hadoop HDFS at periodic intervals. Preparing schemas in Hive to store the data, and using Hive scripts to retrieve data from HDFS. Developing a customized ETL layer in MySQL to store aggregated data for up to 1 year.
- Developing modules using Core Java, J2EE, the Executor framework, Apache Camel, EJB 3.0, Apache ActiveMQ, and the JBoss application server. Developing the communication layer between the different components of the system using socket programming, with a scalable adapter collecting 100K packets per second.
Job Time: Full Time
City: New York
State: New York or any other unanticipated locations/worksites throughout the US
Mail resumes to: email@example.com
Openwave Computing LLC
1220 Broadway, Suite #703, New York, NY 10001
Consistent with the Openwave Employee Referral program, Openwave internal employees are encouraged to refer suitable candidates for the above open position; you will be entitled to $1,000 for every successful referral.