Job Archives
*Qualifications*
Bachelor’s degree in engineering or supply chain management, or equivalent relevant work experience
Master’s degree in industrial engineering, supply chain, operations research, computer science, computer engineering, statistics, or a related field; postgraduate certifications such as APICS CPIM, CSCP, or similar supply chain certifications are a plus
Proven experience in supply chain management, with deep functional knowledge in at least one area of supply chain planning, e.g. operations, demand planning, supply planning, inventory optimization, logistics optimization, or supplier network optimization
Experience leading supply chain teams, training, or projects; full project management experience preferred
Kinaxis Author Level 2 or higher certifications
Job Category: Solution Architect
Job Type: Contract
Location: Remote
Responsibilities:
1. Lead the development and implementation of advanced analytics models and solutions to extract value from data.
2. Utilize machine learning, deep learning, and other data science techniques to design, develop and optimize algorithms that drive data insights and product performance.
3. Leverage computer vision and NLP to enhance our product's ability to interpret and understand visual and textual data.
4. Utilize Spark, Databricks, TensorFlow, and Python to handle data extraction, analysis, and model development.
5. Collaborate with cross-functional teams to understand business needs, identify data-driven solutions, and translate complex findings into actionable strategies.
6. Mentor and guide junior data scientists, fostering a culture of innovation and excellence in data science.
7. Stay updated with the latest industry trends and advancements in data science, ensuring our company remains at the forefront of technological innovation.
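The modeling work described above can be sketched in miniature. The snippet below fits a toy linear predictor with ordinary least squares in pure Python; it is illustrative only (real work in this role would use TensorFlow, Spark, or similar, and the sample numbers are invented):

```python
# Toy predictive model: fit y = slope*x + intercept by ordinary least squares.
# Illustrative sketch only -- production work would use TensorFlow/Spark.

def fit_ols(xs, ys):
    """Closed-form OLS for a single feature: returns (slope, intercept)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

def predict(model, x):
    """Apply the fitted (slope, intercept) pair to a new input."""
    slope, intercept = model
    return slope * x + intercept

if __name__ == "__main__":
    xs = [1.0, 2.0, 3.0, 4.0]   # invented training inputs
    ys = [2.1, 3.9, 6.2, 7.8]   # invented training targets
    model = fit_ols(xs, ys)
    print(predict(model, 5.0))
```

The same fit/predict split carries over directly to the deep-learning frameworks named below; only the model family and the scale of the data change.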
Qualifications:
1. Bachelor's degree in Computer Science, Statistics, Mathematics, or a related field. Advanced degree (Master's or Ph.D.) is preferred.
2. Minimum of 3 years of experience in data science, with a focus on machine learning, deep learning, computer vision, NLP, and Spark.
3. Proficiency in Python development, TensorFlow, and Databricks.
4. Proven experience working in a high-growth startup, ideally within the B2B SaaS space.
5. Strong understanding of statistical analysis, data mining, and predictive modeling.
6. Excellent problem-solving skills with the ability to think critically and strategically.
7. Strong communication skills with the ability to effectively present complex data to non-technical stakeholders.
Job Category: Developer
Sravz is looking for a remote Collibra consultant. If interested, please forward your resume.
Job Category: Developer
Office Administrator to work onsite in Bridgewater, NJ
Role: Netsuite Developer
Location: Remote role
Job Type: Contract
Description:
Requirement:
Customize NetSuite ERP to meet the unique needs and requirements of the organization, including workflow automation, scripting, and record customization
Integrate NetSuite with other systems, applications, and third-party platforms to ensure seamless data flow and synchronization
Provide ongoing support and administration for NetSuite users, troubleshoot issues, and proactively address system maintenance and optimization
Develop and maintain scripts (SuiteScript) to automate tasks, enhance functionality, and create custom reports
Create and maintain detailed documentation of customizations, configurations, and workflows
Collaborate with our Security and Compliance teams to ensure change management protocols are adhered to for financially relevant systems
Manage data imports, exports, and data migrations within NetSuite
Create and review system-generated reports for completeness and accuracy
Collaborate with cross-functional teams to identify areas for process improvement and design solutions within NetSuite
Conduct training sessions for end users to ensure they effectively use NetSuite features and tools
Stay up to date with NetSuite updates, patches, and best practices to ensure the platform's compliance and security
Any other tasks that may be assigned to help the company meet its goals
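As one hedged illustration of the integration work above: the sketch below builds (but does not send) a record-creation request against NetSuite's SuiteTalk REST records endpoint. The account ID, token, and field values are placeholders, not real credentials or a confirmed record schema:

```python
import json
import urllib.request

# Sketch of a SuiteTalk REST record-creation request (built, never sent).
# ACCOUNT_ID and TOKEN are placeholders -- real calls require OAuth
# credentials provisioned in the NetSuite account.
ACCOUNT_ID = "1234567"
TOKEN = "example-token"

def build_create_request(record_type, fields):
    """Build an HTTP POST that would create a NetSuite record via REST."""
    url = (f"https://{ACCOUNT_ID}.suitetalk.api.netsuite.com"
           f"/services/rest/record/v1/{record_type}")
    body = json.dumps(fields).encode("utf-8")
    req = urllib.request.Request(url, data=body, method="POST")
    req.add_header("Content-Type", "application/json")
    req.add_header("Authorization", f"Bearer {TOKEN}")
    return req

req = build_create_request("customer", {"companyName": "Acme Co"})
print(req.full_url)
```

In practice the same payload-building step is what a SuiteTalk or middleware integration repeats for each synchronized record type.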
What You’ll Need to Have:
Bachelor’s degree in Information Technology, Computer Science, and/or relevant industry certification and experience
Proven experience in NetSuite development, including SuiteScript (JavaScript), SuiteFlow, and SuiteTalk
Working knowledge of SuiteCloud Development Framework, SQL, ODBC/JDBC, REST and SOAP web services, HTML and CSS, JSON and XML
Working knowledge of GitHub
Strong understanding of business processes and the ability to translate business requirements into technical solutions
Familiarity with NetSuite SuiteCloud development tools
Experience with data integration and data migration
Experience using support ticket management systems such as Zendesk or JIRA Service Desk
Excellent customer service, problem-solving, and analytical skills
Ability to work well under tight deadlines, respond to rapidly changing demands, and follow up efficiently
Strong sense of ownership, critical thinking, and urgency, with great attention to detail
Excellent oral and written communication skills
Preferred Qualifications:
3-5 years as a NetSuite developer
3-5 years of experience working within an IT team in a hyper-growth environment or startup
Experience with Advanced Revenue Management (ARM) and Subscription Billing
NetSuite certification is a plus
2nd requirement - Sr. Data Scientist
Skills: SAS, Python, PySpark, Databricks, Machine Learning, and strong/recent healthcare experience
Location: Remote
Long-term contract
Job Category: Developer
Position: TIBCO Admin
Location: Remote - USA
Type of Employment: Long Term Contract
Upgrade the TIBCO Platform to version 6.10
Requirements:
Version Upgrades: Lead and execute TIBCO product upgrades, including TIBCO BusinessWorks (BW), TIBCO Enterprise Messaging Service (EMS), and other components. Collaborate with cross-functional teams to plan, test, and implement upgrades.
Compatibility Assessment: Evaluate the impact of upgrades on existing applications, interfaces, and integrations. Ensure backward compatibility and address any issues arising from version changes.
Hands-On: Perform hands-on coding with TIBCO ActiveMatrix BusinessWorks (AMX BW), TIBCO Enterprise Message Service (EMS), and TIBCO Hawk as needed for proof-of-concept validations and to guide other team members
Design: Work with Solution Architects and Business Unit tech leads to analyze requirements and define the high-level design
Performance Testing: Conduct thorough performance testing after upgrades to validate system stability, scalability, and responsiveness.
Documentation and Training: Create detailed documentation for upgrade processes, best practices, and troubleshooting guidelines. Define and own strategy for SW deployment, work on automation of deployments. Train team members on new features and functionalities.
Monitoring and Support: Monitor post-upgrade environments, address any issues promptly, and provide ongoing support to users and development teams.
Security and Compliance: Implement security patches and ensure compliance with organizational policies during upgrades.
Qualifications:
Education: A relevant degree (Bachelor’s or Master’s) in Computer Science, Information Technology, or related fields.
Certifications: Excellent technological skills (e.g., TIBCO BW, TIBCO EMS); certifications are advantageous.
Job Category: Administrator
Job Title: Senior Big Data Developer
Location: Remote, USA
Duration: Long Term Contract
Job Description
Purpose of the Position:
To provide Big Data Engineering capacity to the customer. This position is critical for smooth onshore/offshore coordination and client interactions. The role involves providing hands-on technical development and implementing best practices, a commitment to understanding business data insights, and responsibility for optimizing data storage. Demonstrated ability to handle multiple tasks and work with business users directly is required.
Key Result Areas and Activities:
Understand the media domain and come up with new insights
Perform data analysis, working with the business and BAs to close on requirements and debug any UAT/prod issues
Accountable for AWS costs and services such as S3, Airflow, SES, EMR, EC2, Spark, Scala, Python, Lambda, SingleStore, etc.
Support the existing application and work on user access requests for backend/frontend, following security protocols
Optimize AWS cloud resources
Work on Tableau/MSTR issues, upgrades, and portal enhancements
Work and Technical Expertise:
Essential Skills:
7 to 9 years of IT Experience
2 to 3 years of AWS Cloud experience
2 to 3 years of hands-on experience in Scala, Python, Spark, Lambda, Airflow, EMR, and SQL
Performance tuning and optimization skills using various techniques
Clear and crisp communication skills
Advanced ETL and warehousing concepts
Intermediate-level MSTR/Tableau expertise
Experience in working in onshore / offshore model
Strong knowledge of backups, restores, recovery models, database shrink operations, Clustering, Database mirroring, Replication
Experience in Performance Tuning, Query Optimization, using Performance Monitor, SQL Profiler and other related monitoring and troubleshooting tools
Work independently based on the mentorship provided by managers
Experience with Information Technology Infrastructure Library (ITIL) framework
Solid skills in decision-making, prioritization and negotiation
Qualifications:
Bachelor's degree in computer science, engineering, or a related field (Master's degree is a plus)
Demonstrated continued learning through one or more technical certifications or related methods
7 to 9 years of IT experience including 4 years of AWS cloud experience
Qualities:
Self-motivated and focused on delivering outcomes for a fast growing team and firm
Able to communicate persuasively through speaking, writing, and client presentations
Able to consult, write, and present persuasively
Able to work in a self-organized and cross-functional team
Able to iterate based on new information, peer reviews, and feedback
Able to work with teams and clients in different time zones
Research focused mindset
Prior experience of working in a large media company would be added advantage
Job Category: Developer
Description
Position: Oracle Golden Gate DBA/Architect
Location: Remote – USA
Type of Employment: Full time
Must-Have:
7-10 years of IT experience
5-8+ years of Oracle GoldenGate DBA experience – MPP and/or cloud-based
3+ years of experience with MongoDB & Cassandra
AWS Redshift administration hands-on experience
Performance tuning of databases using various techniques
Hands-on experience in DB administration tools
Experience on DB migration projects
DB security management
Workload management, server health checks, stats, etc.
Relevant AWS Certification
Understanding of AWS SCT
Experience working with Dev/IT teams
Good experience with database upgrade activities
Strong knowledge of backups, restores, recovery models, database shrink operations, DBCC commands, clustering, database mirroring, and replication
Experience troubleshooting collation issues
Experience with AlwaysOn high availability (configuration and troubleshooting)
Knowledge of SQL Server tools (Profiler, DTA, SSMS, PerfMon, DMA (Data Migration Assistant), DMVs, system procs)
Experience in Performance Tuning, Query Optimization, using Performance Monitor, SQL Profiler and other related monitoring and troubleshooting tools
Work independently based on the mentorship provided by managers.
Experience with Information Technology Infrastructure Library (ITIL) framework
Solid skills in decision-making, prioritization and negotiation
For this role, the requirements break down as follows.
Must have (the role is still heavy on Oracle DBA activities, assuming there is no Oracle DBA on the team; if there is, the last two points can be nice-to-haves):
5 to 8 years in Oracle GoldenGate administration – MPP and/or cloud-based
Experience troubleshooting collation issues
Experience with AlwaysOn high availability (configuration and troubleshooting)
Experience in performance tuning and query optimization, using Performance Monitor, SQL Profiler, and other related monitoring and troubleshooting tools
Hands-on experience in Oracle DB administration: backups, restores, recovery models, database shrink operations, DBCC commands, clustering, database mirroring, replication, security management, workload management, server health checks, stats, etc.
Good experience with database upgrades and migration
Nice to Have
3+ years of experience with MongoDB & Cassandra
AWS Redshift administration hands-on experience
Relevant AWS Certification
Understanding of AWS SCT
Experience in working with Dev/IT teams.
Knowledge of SQL Server tools (Profiler, DTA, SSMS, PerfMon, DMA (Data Migration Assistant), DMVs, system procs)
Work independently based on the mentorship provided by managers.
Experience with Information Technology Infrastructure Library (ITIL) framework
Solid skills in decision-making, prioritization and negotiation
Job Category: DBA
Location: MA - Hybrid
Job Type: FTE
Salary: 60K
We are seeking a Junior .NET Developer with 3-4 years of experience.
Creates and maintains C# code using MS tools such as MS Visual Studio, MS SQL Server Studio, Visual SourceSafe, and Subversion
Leads stakeholder meetings to resolve technical issues and design conflicts
Leads team strategy meetings to resolve environmental technical constraints and policy decisions
Supports and guides junior developers as needed
Qualifications
4-year degree in Computer Science, Engineering, Mathematics, Business, or a related discipline from an accredited university
Technical certifications are a plus
Nice to have:
Eight+ years of experience developing enterprise-class web applications in the public or private sector
Skills
Work experience with the following technologies
o .NET C# – Expert skill required
o HTML 4.0 – Expert skills required
o CSS – Expert skills required
o JavaScript – Strong skills required
o MS SQL Server – Strong skills required
o VB.NET – Strong skills preferred
Work experience with the following tools - Preferred
o Visual Studio 2010 – 2017
o MS SQL Server Studio 2008
o IIS 6.0 – Intermediate skills desired
o Working knowledge of MCSD best practices
Strong understanding of software development using Scrum and similar processes
Strong understanding of BDD and TDD
Expert understanding of the data, business, and presentation layers in web development
Expert understanding of web architecture
Demonstrated competence avoiding security issues related to web applications
Demonstrated competence with software design patterns and how to leverage them
Working knowledge of 508 / WCAG
Experience working with and mentoring junior developers within an Agile environment
Experience developing enterprise business applications
Strong work ethic with a relaxed attitude needed
Job Category: Developer
Location: AZ/United States - Remote
Type of Employment: Full time/Contract
Key Result Areas and Activities:
As a Senior ThoughtSpot Developer, you will be responsible for designing, developing, and
maintaining ThoughtSpot applications and solutions that enable our clients to extract actionable
insights from their data. You will collaborate with cross-functional teams to create intuitive and
interactive data visualization solutions. This role requires a strong understanding of ThoughtSpot's
architecture and capabilities, as well as the ability to translate business requirements into effective
analytics solutions.
Key Responsibilities:
Collaborate with business analysts and stakeholders to gather and understand data requirements.
Design and develop ThoughtSpot data models and data pipelines.
Create ThoughtSpot worksheets, pinboards, and visualizations to present data insights.
Optimize ThoughtSpot performance for large datasets and complex queries.
Develop custom integrations and extensions to enhance ThoughtSpot functionality.
Troubleshoot and resolve issues related to ThoughtSpot applications.
Provide mentorship and guidance to junior developers.
Work and Technical Experience:
Must-Have Skills:
ThoughtSpot Proficiency: Demonstrated experience with ThoughtSpot, including the ability to create worksheets, pinboards, and data models.
Data Modelling: Strong understanding of data modelling principles and experience in designing efficient data models for ThoughtSpot.
SQL and Data Manipulation: Proficiency in SQL and data manipulation skills to prepare data for visualization.
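As a minimal illustration of the SQL-and-data-manipulation skill above, the sketch below aggregates raw rows to the grain a visualization layer would consume. It uses SQLite in place of a real warehouse, and the table and column names are invented, not a real ThoughtSpot schema:

```python
import sqlite3

# Sketch: shape raw rows for a visualization layer with plain SQL.
# SQLite stands in for the warehouse; schema names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("east", 50.0), ("west", 80.0)],
)

# Aggregate to the grain a worksheet/pinboard would consume.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('east', 150.0), ('west', 80.0)]
```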
Nice to have:
ETL: Familiarity with ETL processes and tools for data extraction, transformation, and loading.
ThoughtSpot Administration: Knowledge of ThoughtSpot administration, including user management, security configurations, and performance optimization.
Problem-Solving: Strong problem-solving skills and the ability to troubleshoot and resolve technical issues.
Communication: Excellent communication and collaboration skills to work effectively with cross-functional teams and clients.
Team Leadership: Experience in providing guidance and mentorship to junior developers.
Good-to-Have Skills:
Scripting Languages: Familiarity with scripting languages such as Python or JavaScript for customizations and automation.
Big Data Technologies: Experience with big data technologies such as Hadoop, Spark, or Hive.
Cloud Platforms: Knowledge of cloud platforms like Databricks, Snowflake, or Google Cloud for ThoughtSpot deployments.
Data Warehousing: Understanding of data warehousing concepts and technologies.
Data Governance: Familiarity with data governance and data quality best practices.
Certifications: ThoughtSpot certifications or relevant industry certifications.
Data Security: Knowledge of PII and masking is desirable.
Agile: Working knowledge of Agile/SAFe.
Domain Expertise: Knowledge of retail domain KPIs.
Qualifications:
Bachelor’s degree in computer science, Information Technology, or a related field.
Minimum of 3-4 years of experience in ThoughtSpot
Minimum 8-10 years of experience with Business Intelligence tools like Tableau, MicroStrategy, or similar, and with data analytics.
Qualities:
Excellent verbal and written communication
Collaboration skills to work in self-organized and cross-functional teams
Strong troubleshooting and problem-solving abilities.
Excellent analytical, presentation, reporting, documentation, and interactive skills.
Job Category: Developer
Role: Big Data Developer
Location: New York / New Jersey.
Responsibilities
Designing Hive/HCatalog data models, including table definitions, file formats, and compression techniques for structured and semi-structured data processing
Implementing Spark-based ETL frameworks
Implementing big data pipelines for data ingestion, storage, processing, and consumption
Experienced AWS developer, familiar with AWS services: S3, EC2, EMR/Databricks, Lambda, AWS CI/CD
Enhancing the Talend-Hive/Spark and Unix-based data pipelines
Developing and deploying Scala/Python-based Spark jobs for ETL processing
Strong SQL and DWH concepts
Preferred Background
Function as an integrator between business needs and technology solutions, helping to create technology solutions to meet clients’ business needs
Lead project efforts in defining scope, planning, executing, and reporting to stakeholders on strategic initiatives
Understanding of the EDW system of the business, and creating high-level design and low-level implementation documents
Understanding of the Big Data Lake system of the business, and creating high-level design and low-level implementation documents
Designing big data pipelines for data ingestion, storage, processing, and consumption
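The ingestion/processing/consumption stages named above can be sketched in miniature. Pure Python stands in for the Spark/Hive stack here, and the record fields are invented for illustration:

```python
import json
from collections import defaultdict

# Miniature ingest -> transform -> consume pipeline. Pure Python stands in
# for Spark; the event fields are invented for illustration.
RAW_EVENTS = [
    '{"user": "a", "bytes": 120}',
    '{"user": "b", "bytes": 300}',
    'not-json',                      # malformed line, dropped at ingestion
    '{"user": "a", "bytes": 80}',
]

def ingest(lines):
    """Ingestion stage: parse raw lines, dropping records that fail to parse."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue

def transform(events):
    """Processing stage: aggregate bytes per user."""
    totals = defaultdict(int)
    for e in events:
        totals[e["user"]] += e["bytes"]
    return dict(totals)

# Consumption stage: hand the aggregate to a report/BI layer.
result = transform(ingest(RAW_EVENTS))
print(result)  # {'a': 200, 'b': 300}
```

In a Spark job the same shape appears as read → map/filter → reduceByKey → write, with storage (S3/Hive tables) between the stages.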
Job Category: Developer
Sr. Cloud-Ops Lead
Position: Sr. Cloud-Ops Lead
Location: United States (preferably NY/NJ area) | Remote, with readiness to relocate
Type of Employment: Full time
Travel: NA
Key Results Areas:
Be a subject-matter expert as it relates to cloud infrastructure and DevOps practices at CloudOps
Design cloud infrastructure and DevOps solutions, architectures and roadmaps
Evolve and build best practice materials for Infrastructure-as-Code and Configuration Management
Enable development teams to leverage cloud-native architectures with new and existing applications (examples: GitLab, Vault)
Create, build, and grow partnerships with various organizations relevant to the practice
Govern and report on SLAs; resolve time-critical issues
Work and Technical Experience:
9+ years of overall experience, including a minimum of 3 years of experience with cloud-based Data & Analytics implementations
Must have experience with AWS, Snowflake, and serverless infrastructure paradigms
Experience with deploying and managing scalable systems with high availability on AWS
Provide support for AWS systems, including monitoring and resolution of issues
Knowledge of networking, monitoring, and logging for cloud services
Experience with Infrastructure-as-Code tools like Terraform, CloudFormation, and ARM
Software development and scripting experience using Python and/or Golang
Interest in containerization technologies such as Docker, containerd, etc.
Experience with cloud deployment automation and configuration management for AWS
Experience with AWS services such as Aurora, RDS, S3, Lambda, Elastic Beanstalk, EC2, VPC, SQS, Cognito, IAM, WAF, Route 53, CloudFront, CodePipeline, and CloudWatch
Knowledge of SQL, and non-relational (NoSQL) databases
Understanding of emerging technologies and end-user needs
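As a small illustration of the Infrastructure-as-Code practice above, the sketch below emits a minimal CloudFormation template from Python. The bucket name is a made-up placeholder, and a real template would add versioning, encryption, and tags:

```python
import json

# Minimal Infrastructure-as-Code sketch: generate a CloudFormation
# template for an S3 bucket. Bucket name is a placeholder.
def s3_bucket_template(bucket_name):
    """Return a CloudFormation template dict declaring one S3 bucket."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "DataBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {"BucketName": bucket_name},
            }
        },
    }

template = s3_bucket_template("example-analytics-raw")
print(json.dumps(template, indent=2))
```

Generating templates programmatically like this is one way teams keep environment-specific values out of hand-edited JSON/YAML; Terraform or CDK would fill the same role.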
Qualifications:
Master’s or Bachelor’s degree in Computer Science or a similar relevant field
Preferred - AWS Associate/Professional Level Certifications
Experience with Agile methodologies and tools, and with Continuous Integration & Continuous Deployment tools and processes
Qualities:
Self-motivated and focused on delivering outcomes for a fast-growing firm
Strong analytical skills with the ability to learn new information quickly
Manage time effectively, ability to multi-task and meet deadlines
Team player with great interpersonal and communication skills
Job Category: Developer
Job Role: Core Java + Big Data
Location: San Francisco, CA/Remote
Hire Type: Fulltime
Working Hours: PST Time Zone
Work Experience: 6-10 Years
Expert-level proficiency in writing production-quality code in Core Java, with Scala and Python as optional
Excellent knowledge and proven experience of common design patterns and architectures used in distributed data processing, data quality, data warehouses, metadata management, etc.
In-depth knowledge of and experience working on distributed data processing systems using open-source technologies like MapReduce, Hive, Tez, Spark, and Kafka (Spark is a must)
Good to have: experience building data ingestion pipelines on public cloud infrastructure, preferably AWS
Good to have: experience working with EMR, Databricks, and Snowflake
Job Category: Developer
JOB DESCRIPTION
- Work location: Remote
- Hiring type: Contract
- Contract duration: 4-6 months
Job Category: Developer
A) Job Description:
- The candidate will work with the project team, team leaders, project delivery leads, and client stakeholders to create the right solutions based on business needs.
- The candidate will define components of SAP Convergent Mediation (CM).
- The successful candidate will design and develop interfaces for applications to collect data using design methodologies and tool-sets, developing software (JavaScript) in CM, and conducting tests.
B) Required Experience:
- Solid understanding of SaaS-based application models: subscription models and consumption models
- 3+ years of system analysis, solution design, and system architecture experience, with demonstrated experience and successful projects in large enterprises
- Previous experience designing and engineering highly complex application components and integrating software packages using various tools
- Possess a mix of consultative skills, business knowledge, and technical expertise to effectively integrate technology and achieve business outcomes
- Define customer requirements and perform gap analysis of standardized processes within the SAP BRIM solution
- Functional expertise in several BRIM process areas including Subscription Order Management, Convergent Mediation, Convergent Charging, Convergent Invoicing, FI-CA and Customer Financial Management and overall application architecture of BRIM.
- Convergent Invoicing experience to include: Consumption Items, Billable Items, Billable Item Class Configuration, billing item selection, aggregation, collective billing and reversal principles, Invoicing Integration with Tax and posting in Accounts Receivable (AR).
C) Qualifications:
- Experience with the SAP S/4HANA Subscription Order Management module is a plus. Experience with the SAP CRM Subscription Order Management module is an acceptable substitute
- Minimum 5 years of functional/technical proficiency with at least 2 full-lifecycle SAP implementations
- Minimum 4 years of experience crafting/developing/maintaining highly scalable and resilient systems in SaaS business models
- Minimum of 1 year of experience in SAP BRIM or Hybris Billing solution components such as Subscription Billing, Subscription Order Management (SOM), Convergent Mediation (CM), Convergent Charging (CC), Convergent Invoicing (CI), or contract accounting (FI-CA)
- Minimum of 1 end-to-end SAP BRIM or Hybris Billing implementation
- 1+ year(s) with SAP Convergent Mediation
- 4+ years in the SAP BRIM ecosystem
- Skilled in J2EE (EJB, Servlet/JSP, XML); overview of BRIM architecture
- Technical hands-on experience in at least one of the following: Subscription Billing, Convergent Mediation, Convergent Charging, Convergent Invoicing, or contract accounting (FI-CA)
- Earned SAP Hybris Billing/BRIM certifications
- Hire Type: Contractor
- Visa Type: H1B
- Mode of Work: Remote / Hybrid
- Location: East Coast / West Coast
Job Category: Architect