Job Archives
Remote NY/NJ
FTE
Responsibilities:
Act as Product Owner for applications built on the AWS/MSTR landscape
Study the existing technology landscape and understand current application workloads
Understand and document technical requirements from clients
Define the migration strategy to move applications to the cloud
Assist the Architect in designing the overall Virtual Private Cloud (VPC) environment, including server
instances, storage, subnets, availability zones, etc. (see the sketch after this list)
Assist the Architect in designing the AWS network architecture, including VPN connectivity between
regions and subnets
Design the HA/DR strategies
Set up processes, services, and tools around the cloud
Oversee build of the environment
Execute migration plan
Leverage appropriate AWS services
Validate that the environment meets all security and compliance controls
Provide top-quality solution design and execution
Provide support in defining the scope and sizing of work
Engage with clients to understand strategic requirements
Responsible for translating business requirements into technology solutions
Work with domain experts to put together a delivery plan and stay on track
Utilize AWS & Big Data technologies to design, develop, and evolve scalable and fault-tolerant
distributed components
Responsible for the design and execution of abstractions and integration patterns (APIs) to
solve complex distributed computing problems.
Retail Industry Experience – good to have
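As a rough illustration of the VPC design work called out above, the boto3 sketch below creates a VPC with one subnet per availability zone. The region, CIDR ranges, availability zones, and tag name are placeholder assumptions, not requirements from this posting.

```python
# Hedged sketch: stand up a VPC and per-AZ subnets with boto3.
# Region, CIDR blocks, AZ names, and tags are hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create the VPC that will host the migrated application workloads.
vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
ec2.create_tags(Resources=[vpc_id], Tags=[{"Key": "Name", "Value": "app-migration-vpc"}])

# One subnet per availability zone to support the HA/DR strategy.
for index, az in enumerate(["us-east-1a", "us-east-1b"]):
    subnet = ec2.create_subnet(
        VpcId=vpc_id,
        CidrBlock=f"10.0.{index}.0/24",
        AvailabilityZone=az,
    )
    print("Created subnet", subnet["Subnet"]["SubnetId"], "in", az)
```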
Requirements:
Bachelor’s Degree or MS in Engineering or equivalent
In-depth knowledge of key AWS services such as EC2, S3, etc.
Proven experience assessing clients' workloads and technology landscape for cloud suitability,
developing the business case, and building a cloud adoption roadmap
Proven knowledge of the AWS platform and tools
Experience in defining new architectures and the ability to drive projects from an architecture
standpoint
Knowledge of best practices and market trends pertaining to the cloud and the overall industry, in
order to provide thought leadership (seminars, whitepapers, etc.) and mentor the team to build the
necessary competency
Ability to quickly establish credibility and trustworthiness with key stakeholders in the client
organization.
Experience provisioning and spinning up AWS Clusters
Excellent verbal, written and presentation skills.
AWS architecture certification is desirable
Technology Requirements:
Knowledge of AWS and cloud technologies is a must, including AWS Glue, EMR, S3, Lambda, streaming
platforms, data lake patterns, and security setup.
Applicants are required to have knowledge of Big Data tools and technologies in the following
areas: Linux expertise, EC2, the Big Data and Hadoop ecosystem, and working knowledge of tools like
Hive, Spark, Kafka, streaming platforms, Oozie, HDFS, etc.
Required to have development skills in a language such as Java, Scala, Python, or shell
scripting, as well as AWS Lambda.
Design and develop: understand end-to-end cloud design and the various architectural components of
a pipeline. Able to develop and orchestrate data flow pipelines with SQL or scripting (see the sketch after this list).
In addition to technical skills, applicants need to possess knowledge of the various phases of the software
development life cycle in the cloud and of source control tools like GitHub.
Ability to work with large data sets: Big Data involves large data sets, so applicants must be able
to work with highly diverse data in multiple types and formats, from multiple sources, in the cloud.
Knowledge of or experience with setting up a data lake and data lake principles.
Good to have: AWS certification and any other architecture achievements.
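As one hedged illustration of the "develop and orchestrate data flow pipelines with SQL or scripting" item above, the PySpark sketch below reads raw JSON from S3, transforms it with Spark SQL, and publishes partitioned Parquet to a curated zone. The bucket paths, schema, and column names are assumptions for illustration only.

```python
# Minimal PySpark sketch of an S3-to-S3 data flow step (illustrative only).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

# Ingest raw JSON landed in the data lake's raw zone.
raw = spark.read.json("s3://example-datalake/raw/orders/")
raw.createOrReplaceTempView("orders_raw")

# Transform with Spark SQL: drop bad records and derive a partition column.
curated = spark.sql("""
    SELECT order_id,
           customer_id,
           CAST(order_ts AS DATE) AS order_date,
           amount
    FROM orders_raw
    WHERE order_id IS NOT NULL
""")

# Publish to the curated zone as partitioned Parquet for downstream consumption.
(curated.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-datalake/curated/orders/"))

spark.stop()
```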
Job Features
Job Category | Lead |
Job Role: Core Java + Big Data
Location: San Francisco, CA/Remote
Hire Type: Contractor or Fulltime
Working Hours: PST Time Zone
Work Experience: 6-10 Years
Expert-level proficiency in writing production-quality code in Core Java, with Scala & Python
as optional
Excellent knowledge and proven experience of common design patterns and architectures
used in distributed data processing, data quality, data warehouses, metadata management,
etc.
In-depth knowledge of and experience working on distributed data processing systems using
open-source technologies like MapReduce, Hive, Tez, Spark, and Kafka (Spark is a must)
Good to have - experience building data ingestion pipelines on public cloud infrastructure,
preferably AWS
Good to have - experience working with EMR, Databricks and Snowflake
Job Features
Job Category | Developer |
Location: SFO (CA)
- Expert-level proficiency in writing production-quality code in Core Java, with Scala & Python as optional
- Excellent knowledge and proven experience of common design patterns and architectures used in distributed data processing, data quality, data warehouses, metadata management, etc.
- In-depth knowledge of and experience working on distributed data processing systems using open-source technologies like MapReduce, Hive, Tez, Spark, and Kafka (Spark is a must)
- Good to have - experience building data ingestion pipelines on public cloud infrastructure, preferably AWS
- Good to have - experience working with EMR, Databricks and Snowflake
Job Features
Job Category | Developer |
Location LA (California)
Responsibilities:
- Acting as a MicroStrategy expert to advise customers in defining the MicroStrategy roadmap, security guidelines, and MicroStrategy architecture for complex enterprise systems.
- Working directly with our clients, prospects and partners to deliver high quality business intelligence solutions to our customers
- Working with the management on creating project estimates and resource planning
- Engage with the Solution Designer in the reviews and validation of the detailed technical design for business applications to ensure alignment with business requirements
- Understand technical specifications and create report design documents
- Review designs, code and testing with the project team as appropriate
- Developing and testing detailed functional designs for business solution components and prototypes
- Incorporate best practices and standards in metadata and report/dashboard development
- Follow development standards and effectively raise critical issues with the client
Preferred Background:
- 8+ years of experience on MicroStrategy Desktop, Architect, Web
- Demonstrated experience with translating business requirements into data analysis and reporting solutions
- Experience with Intelligent Cube development, custom widget and visualization development, and implementation
- Hands-on experience with project creation, schema object creation, and development of queries, documents & dashboards with Flex-based widgets
- Experience using Enterprise Manager for monitoring, auditing, and performance tuning, and with VLDB settings
- Experience in development & integration which will include setting up configuration management, developing components, peer review, creating unit test plans & performing unit testing
- Must have strong SQL and RDBMS (Data model) skills
Job Features
Job Category | Architect, Developer, Lead |
Skills:
- Project manager with 8-10 years of experience driving BI projects in agile methodology using Scrum framework
- Strong knowledge of Scrum theory, rules and practices.
- Expert knowledge in estimating, planning for BI projects
- Should have knowledge & experience in using Agile techniques like: User Stories, iterative development, Continuous Integration, Continuous Testing, Incremental delivery etc.
- Knowledge about tasks, backlog tracking, burndown metrics, velocity, user stories etc.
- Familiar with common development practices, Service oriented environments, and Agile practices.
- Experience/ knowledge in data integration and APIs
- Knowledge of SQL and Snowflake
Responsibilities:
- Responsible for managing the scrum process to deliver BI Reports following Agile methodology.
- Coordinate between data & reporting teams in a multi-location environment to ensure delivery of reports & dashboards to end users
- Responsible for removing impediments for the scrum team
- Arrange and facilitate daily stand-up meetings, demos, and decision-making processes to ensure quick inspection and proper use of reports & dashboards
- Help the product owner keep the product backlog in good shape and ready for the next sprint.
- Organize and facilitate the sprint planning meeting, daily scrum meetings & retrospective meetings
- Facilitate the team's creativity and improve the efficiency of the development team
Job Features
Job Category | Administrator, Project Manager |
Location: Phoenix or Remote
- Informatica Administrator: hands-on experience installing and configuring PowerCenter, EDC, and Meta Query; upgrading PowerCenter, Meta Query, and EDC
- Knowledge of clustering; applying hotfixes and EBFs; good knowledge of Informatica EDC; capacity planning for Informatica servers
- Perform analysis and fixes of L2 & L3 issues in the Informatica platform; backup, data recovery, and service restoration management for the Informatica platform (EDC, PowerCenter)
- Support & enhancements: propose best practices to enhance application stability
- Sound knowledge of writing bash/shell scripts; implement and enhance monitoring to provide visibility into the health of the system (see the sketch after this list)
- Analyze system usage trends and perform capacity planning; support and execute preventive maintenance activities; daily/monthly repository backups; create and manage quality documents as needed; resolve tickets and incidents reported by platform users; coordinate with the vendor and application teams during hardware/software upgrades; provide technical support
- Provide acceptance and regression testing environments for maintenance activities and enhancements created by the support team
- Reduce maintenance needs, improve reliability, reduce execution time, and perform performance tuning and monitoring; automate recurring tasks in the admin area
- Able to gather requirements and configure EDC metadata resources; debug issues with data lineage irregularities
- Experience and knowledge with Informatica IICS; able to explore the Informatica Knowledge Base and raise support cases as required
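As a generic, non-Informatica-specific sketch of the monitoring item in this list, the script below checks disk headroom and whether a service port is reachable; the path, host, port, and threshold are assumptions and would differ in a real Informatica domain.

```python
# Hedged sketch of a lightweight platform health check an administrator might schedule.
# Paths, host, port, and thresholds are hypothetical.
import shutil
import socket

def disk_ok(path="/opt/informatica", min_free_gb=20):
    """Return (ok, free_gb) for the given mount point."""
    free_gb = shutil.disk_usage(path).free / 1024**3
    return free_gb >= min_free_gb, round(free_gb, 1)

def port_open(host="localhost", port=6005, timeout=3):
    """True if the (assumed) domain service port accepts connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    ok, free = disk_ok()
    print(f"Disk free: {free} GB ({'OK' if ok else 'LOW'})")
    print("Service port reachable:", port_open())
```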
Job Features
Job Category | Administrator |
Location : East coast
Work hours : EST
Position type : Fulltime /Contract
Mode of work : remote for initial months
Search key words : Data modelling, Data Architect, Snowflake certificate
This position requires sound knowledge of Snowflake, the AWS cloud stack, and Big Data
technologies such as Spark, Scala, Hive, etc., along with knowledge of database ETL tools.
Knowledge of BI tools would be an added advantage (optional)
Responsibilities
Analyze, reproduce, and resolve customer problems of varying complexity
across the data ecosystem using Snowflake, Big Data, and cloud environments,
including Hadoop and AWS
Document problems and deliver solutions to the end customer on data
requirements.
Work with multiple product teams across the data landscape and effectively
build the data clean room for the customer with the right access
Coordinate with application teams (upstream / downstream) to ensure that the
Snowflake jobs are running as per the schedule, and work with those teams as needed (see the sketch after this list)
Adopt and implement best practices emphasizing Agile and DevOps.
Troubleshoot production issues and steer them to resolution along with other
teams. Maintain and secure the cloud environments; work with client security teams
on data / application security requirements
Manage AWS infrastructure and user provisioning requests.
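As a minimal sketch of verifying that scheduled Snowflake jobs ran on time, the snippet below uses the snowflake-connector-python library and Snowflake's TASK_HISTORY table function. The account, credentials, and warehouse names are placeholders, and real credentials would come from a secrets manager rather than code.

```python
# Hedged sketch: flag Snowflake tasks that did not succeed in the last day.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",   # placeholder; real values come from configuration
    user="svc_monitoring",
    password="***",
    warehouse="MONITOR_WH",
)

query = """
    SELECT name, state, scheduled_time, error_message
    FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(RESULT_LIMIT => 100))
    WHERE scheduled_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
"""

cur = conn.cursor()
try:
    for name, state, scheduled, error in cur.execute(query):
        if state != "SUCCEEDED":
            print(f"Task {name} scheduled at {scheduled}: {state} - {error}")
finally:
    cur.close()
    conn.close()
```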
Preferred Background
Minimum of 6 to 8 years of experience in data management and modeling, with 2
years of Snowflake development experience
Report on SLAs & KPIs on a weekly / monthly basis
Snowflake certification is preferred for the associate
BS degree in IT, MIS, or a business-related functional discipline.
Required Skills
Strong self-leadership and ability to work independently and manage conflict
Snowflake (Must Have)
Experience working with Snowflake
Experience in handling PII and other data scenarios
Experience in handling data compliance
Experience with data encryption features
Overall understanding of Snowflake architecture and features
Data modelling- Must have
Must have extensive data modelling skills
Create data models that support business analytics requirements
ETL concepts - thorough with ETL concepts like normalization, SCD types, and
change data capture.
Strong SQL knowledge is must
Programming Language (Nice to have):
Python
Cloud - should have know-how of AWS services
Knowledge of reporting tools is a plus
Tableau
Power BI
Databases (Good to have):
Teradata, Snowflake (nice to have)
Working knowledge with Control M
Others –
Jenkins
Github
Jira
Very good communication skills; able to work with multiple teams and manage client
engagements
Location
NY/ NJ Area - Need to relocate once offices open up
Employment Type
FTE
VISA Status
Visa-independent candidates preferred, considering that joining is in 2 weeks.
Work Time zone:
EST
Responsible for designing, documenting, and implementing integrations between AWS/
GCP and financial applications like SAP, Salesforce, BPC, etc. using appropriate methods.
Able to engage in healthy dialogue with business stakeholders and deep dive into the
finance domain. Build the corporate finance domains and their alignment to the data
mesh for ready consumption.
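The duties below name AWS Lambda and AppFlow among the integration methods; under that assumption, here is a minimal sketch of a Lambda handler that triggers a pre-built AppFlow flow (for example, a Salesforce-to-S3 extract feeding the finance domain). The flow name is hypothetical.

```python
# Hedged sketch: start an existing AppFlow flow from Lambda. The flow itself
# (connectors, field mapping, destination) is assumed to be configured separately.
import boto3

appflow = boto3.client("appflow")

def lambda_handler(event, context):
    # Kick off the on-demand flow; AppFlow handles the SaaS connector details.
    response = appflow.start_flow(flowName="salesforce-finance-daily")
    return {
        "executionId": response.get("executionId"),
        "flowStatus": response.get("flowStatus"),
    }
```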
Qualifications
Required Education
Bachelor’s Degree
Required Experience
Minimum 7+ years of data engineering experience building data pipelines; well versed
in DI concepts
Minimum 2 years of hands-on experience with Cloud using AWS (primary) & GCP
(secondary)
Minimum 4 years of experience working in the finance domain
Experience in building integrations with the financial apps like SAP, Salesforce, Blackline
etc.
Should have good consulting skills combined with working knowledge of AWS stack
and architecture. Should be able to give advice to internal teams and stakeholders
on architecture best practices when it comes to real-time processing.
Self-starter with an in-depth hands-on work experience with multiple large-scale
implementations on AWS & GCP
Knowledge of cloud architecture standards and framework in alignment with
enterprise architecture standards and framework
Essential Duties and Responsibilities
Works with a team of Security Engineers to architect AWS solutions, with
responsibility for ensuring the cloud platform maintains a secure, cohesive
integration with itself, partner systems, and solutions.
Build integrations between AWS or GCP and financial systems like SAP, BPC, Blackline
etc using AWS Lambda, Appflow or any other integration methods.
Designs cloud infrastructure, virtual private cloud link, virtual private network,
relational database services, auto scaling, and computing resources to meet
requirements.
Lead data solution architecture definition of complete Enterprise Service solutions
covering the lifecycle from data ingestion, processing, transforming and aligning it to
the Data Mesh strategy
Interact heavily with users belonging to the financial domain to develop data as a
product
Provides recommendations to optimize Cloud usage costs by analyzing usage
patterns and trends while providing recommendations for consolidating instance
types based on demand across all billing accounts.
Maintains an understanding of advancing cloud capabilities, IT and business needs,
while identifying and proposing cost and functional efficiency options for the
collaborative decisions and actions.
Develops cloud architecture standards and framework in alignment with enterprise
architecture standards and frameworks.
Manage and document architecture requirements for interactions between IaaS and
SaaS in a hybrid and multi-cloud environment.
Understanding of Identity and Access Management, user and system authentication
methods and certificates, tokens, and certificate management.
Coordinate with developers, cloud platform engineers, the business and Cloud
Service Provider on architecture design requirements for services.
Preferred Experience
Knowledge and usage of the platform is essential: not only knowledge of the
architecture, but also the ability to educate business users and move them to the cloud
using the necessary tools
Knowledge and experience of Cloud computing (storage, compute, networks, etc.)
Experience designing and implementing scalable Cloud architecture (AWS & GCP)
Cloud storage & database skills; knowledge and experience designing and using
cloud patterns
Experience in designing and implementing scalable, distributed systems leveraging
cloud computing technologies
Proficiency with Infrastructure as Code (IaC) (CloudFormation, Terraform, ARM
templates)
Ability to work independently, manage small engagements or parts of large
engagements.
Key Words to search:
AWS, Data Engineering, Finance domain, SAP, Appflow
Job Features
Job Category | Developer |
What you’ll do?
• You will drive the design & architecture of data platform,
infrastructure, and tools including but not limited to streaming
infrastructure, streaming & batch processing, metadata management, data
warehouse, data quality, data security etc.
• You will provide clear technical guidance to the team based on your own
working experience and industry best practices to deliver a modern data
processing platform that includes stream & batch processing powered by
Spark, data warehousing powered by Snowflake, data quality and reliability,
and metadata management capabilities (see the streaming sketch after this list).
• You will build POCs to evaluate options and make the right technology choices.
• You will build prototypes to provide guidance for the rest of the data
engineering team.
• You will work closely with product management team with respect to
feature prioritization, delivery, high level estimates of effort and high and
mid-level designs while not losing sight of technical debt.
• You will collaborate and communicate effectively within the team and
across teams to deliver impactful data platform and services.
• You will understand how LendingClub’s data is used and what it all means.
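As one hedged illustration of the stream-processing work described in this list, the sketch below consumes events from Kafka with Spark Structured Streaming and lands them as Parquet for a downstream warehouse load (for example, into Snowflake). The topic, brokers, and paths are placeholder assumptions.

```python
# Hedged sketch: Kafka -> Spark Structured Streaming -> Parquet staging area.
# Requires the spark-sql-kafka package; names and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("events-stream-ingest").getOrCreate()

# Read the raw event stream from Kafka.
events = (spark.readStream
               .format("kafka")
               .option("kafka.bootstrap.servers", "broker1:9092")
               .option("subscribe", "loan-events")
               .load()
               .select(col("key").cast("string"),
                       col("value").cast("string"),
                       "timestamp"))

# Write micro-batches to a staging area that a warehouse loader can pick up.
query = (events.writeStream
               .format("parquet")
               .option("path", "s3://example-staging/loan-events/")
               .option("checkpointLocation", "s3://example-staging/_chk/loan-events/")
               .start())

query.awaitTermination()
```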
Who You Are:
• You are passionate about designing and leading the implementation of resilient,
distributed software platforms and large-scale data infrastructures.
• You have excellent listening skills and are empathetic to others.
• You believe in simple and elegant solutions and give paramount importance
to quality.
• You have a track record of shipping working software fast and reliably.
What you’ll need to succeed:
Must have skills:
• B.S., M.S. or Ph.D. degree in computer science or a related field or
equivalent work experience.
• Expert level proficiency in writing production quality code, preferably Scala
& Python.
• Excellent knowledge and proven experience of common design patterns
and architectures used in distributed data processing, data quality, data
warehouses, metadata management etc.
• In-depth knowledge of and experience working on distributed data
processing systems using open-source technologies like MapReduce,
Hive, Tez, Spark, and Kafka.
• Experience building data platforms on public cloud infrastructure,
preferably AWS.
• Bonus points for experience working with EMR, Databricks, and Snowflake.
Nice to have skills:
• Working knowledge of open-source ML frameworks and end-to-end model
development life cycle.
• Previous working experience with running containers (Docker/LXC) in a
production environment using one of the container orchestration services
(Kubernetes, Docker Swarm, AWS ECS, AWS EKS).
Job Features
Job Category | Architect |
Location: United States (Preferably NY/NJ Area) | Remote with Ready to Relocate
Type of Employment: Full time
Travel: NA
Key Results Areas:
Be a subject-matter expert as it relates to cloud infrastructure and DevOps practices at CloudOps
Design cloud infrastructure and DevOps solutions, architectures and roadmaps
Evolve and build best practice materials for Infrastructure-as-Code and Configuration Management
Enable development teams to leverage cloud-native architectures with new and existing applications
(examples: GitLab, Vault)
Create, build and grow partnerships with various organizations relevant to the practice.
Govern and report on SLAs; resolve time-critical issues
Work and Technical Experience:
9+ years of overall experience, including a minimum of 3 years of experience with cloud-based Data &
Analytics implementations
Must have experience with AWS, Snowflake, and serverless infrastructure paradigms
Experience with deploying and managing scalable systems with high availability on AWS
Provide support for AWS systems including monitoring and resolution of issues
Knowledge of networking, monitoring, and logging for cloud services
Experience with Infrastructure as Code tools like Terraform, CloudFormation, and ARM (see the sketch after this list)
Software development and scripting experience using Python and/or Golang
Interest in containerization technologies such as Docker, Containerd, etc.
Experience with cloud deployment automation and configuration management for AWS
Experience with the AWS services such as Aurora, RDS, S3, Lambda, Elastic Beanstalk, EC2, VPC, SQS,
Cognito, IAM, WAF, Route 53, CloudFront, Code Pipeline, CloudWatch
Knowledge of SQL, and non-relational (NoSQL) databases
Understanding of emerging technologies and end-user needs
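As a small infrastructure-as-code illustration for the Terraform/CloudFormation item in this list, the sketch below drives CloudFormation from Python with boto3 and an inline template. The stack name, bucket resource, and region are assumptions, not requirements from this posting.

```python
# Hedged sketch: create a CloudFormation stack containing a single versioned S3 bucket.
import json
import boto3

TEMPLATE = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "ArtifactBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {"VersioningConfiguration": {"Status": "Enabled"}},
        }
    },
}

cfn = boto3.client("cloudformation", region_name="us-east-1")

# Create the stack from the inline template above.
cfn.create_stack(
    StackName="devops-demo-artifacts",
    TemplateBody=json.dumps(TEMPLATE),
)

# Block until CloudFormation reports the stack as created.
cfn.get_waiter("stack_create_complete").wait(StackName="devops-demo-artifacts")
print("Stack created")
```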
Qualifications:
Master's or Bachelor's in Computer Science or a similar relevant field
Preferred - AWS Associate/Professional Level Certifications
Experience with Agile methodologies and tools, Continuous Integration & Continuous Deployment tools
and processes
Qualities:
Self-motivated and focused on delivering outcomes for a fast-growing firm
Strong analytical skills with the ability to learn new information quickly
Manage time effectively, ability to multi-task and meet deadlines
Team player with great interpersonal and communication skills
Job Features
Job Category | DevOps |
This position leads and executes product delivery in partnership with product owners,
architects, and lead engineers as part of a Scrum Team. In this position, you will help product
owners refine business and technical requirements, perform analysis on the system or
capability, test outcomes prior to launch, and partner with operations teams to ensure
supportability of the capability. The technical analyst will analyze, discover, document, and
act as a subject matter expert on the details of key customer facing capabilities, as well as
understand and document the data flow between systems that enable application
functionality. The role will work directly with our various internal and external stakeholders
including business partners, other scrum or IT teams, and external vendors.
Your goals and responsibilities will include:
Understand the business operations in detail; knowledgeable of current and possible
future practices, trends, and information affecting multiple business functions and
aspects of how they relate to other areas.
Support product owners as needed with the refinement of business requirements and
functional design specifications. Identify gaps between business requirements and
application capabilities and recommend options.
Act as a subject matter expert for functional and system capabilities. Analyze,
discover, document, and maintain functional and system designs in Customer
knowledgebase.
Understand and apply platform configurations
Experience working on ETL technologies (Informatica PowerCenter)
Experience with Snowflake and data virtualization (Denodo, etc.)
Responsible for the design and development of mappings, mapplets, sessions, and workflows, and
for scheduling them.
Responsible for handling SCDs in the project (see the SCD sketch after this list).
Responsible for the database design process: logical design, physical design, star schema,
snowflake schema, etc.
Design connection objects and be well versed in session and workflow properties.
3-8 years of progressive experience with Data Warehouse and Informatica
Good programming skills - a quick self-learner with good experience in
Informatica and ETL
Develop solutions in a highly demanding environment with high performance and
availability, and provide hands-on guidance to other team members
Collaborate with product development teams and senior designers to develop
architectural requirements to ensure client satisfaction with Solution.
Assess requirements for completeness and accuracy and decide if they are actionable
for the ETL team.
Conduct impact assessment and determine size of effort based on requirements.
Develop full SDLC project plans to implement ETL solution and resource
identification.
Perform as active, leading role in shaping and enhancing overall ETL Informatica
architecture. Identify, recommend and implement ETL process and architecture
improvements.
Assist and verify design of solution and production of all design phase deliverables.
Identify bugs when developers are not able to do so.
Manage the build phase and quality-assure code to ensure it fulfills requirements and
adheres to the ETL architecture. Resolve difficult design and development issues.
Write SQL queries to query databases as needed for analysis into issues or system
functionality to inform design.
Understand system integration methods and data flow
Actively engage with a scrum team of developers, architects, and analysts in
delivering business requirements, technical requirements and testing for the
implemented functionalities. Work closely with the product owner throughout
delivery to ensure solution meets business needs.
Coordinate test activities with QA engineers, including identifying task
dependencies, test schedules, creation of test data, utilization of test environments,
and performing system testing.
Partner with IT Operations to support resolution of issues as needed and
accept responsibility for the resolution.
As issues occur, provide data analysis and audits to identify and address root cause of
business problems and to make better informed decisions.
Experience with scheduling tools like Tidal
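As a simplified sketch of the SCD handling mentioned in this list, the snippet below shows a Slowly Changing Dimension Type 2 pattern in plain SQL run through a generic DB-API connection; in this role the equivalent logic would typically live in Informatica mappings. The table and column names are hypothetical, and the UPDATE ... FROM syntax assumes a warehouse such as Snowflake, Redshift, or PostgreSQL.

```python
# Hedged SCD Type 2 sketch: expire changed current rows, then insert new current versions.
EXPIRE_CHANGED_ROWS = """
    UPDATE dim_customer d
    SET current_flag = 'N', effective_end = CURRENT_DATE
    FROM stg_customer s
    WHERE d.customer_id = s.customer_id
      AND d.current_flag = 'Y'
      AND (d.name <> s.name OR d.segment <> s.segment)
"""

INSERT_NEW_VERSIONS = """
    INSERT INTO dim_customer
        (customer_id, name, segment, effective_start, effective_end, current_flag)
    SELECT s.customer_id, s.name, s.segment, CURRENT_DATE, NULL, 'Y'
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
    WHERE d.customer_id IS NULL
"""

def apply_scd2(conn):
    """Run the expire-then-insert steps on any DB-API 2.0 connection."""
    cur = conn.cursor()
    try:
        cur.execute(EXPIRE_CHANGED_ROWS)
        cur.execute(INSERT_NEW_VERSIONS)
        conn.commit()
    finally:
        cur.close()
```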
MUST HAVES:
Bachelor’s degree - Math, Science, Computer Science, or Business preferred or
equivalent work experience.
5+ years of experience analyzing data and understanding data ecosystems in a retail or
comparable environment.
SQL and data verification
Experience with data flow validation/ analysis through integrated systems (ETL
Testing)
Excellent analysis/troubleshooting skills, effective partnering/relationship building
skills
Experience managing vendors to deliver components of projects
Understanding of Agile/Scrum frameworks and Waterfall methodology
Should be customer-oriented, mission focused, and a quick, continuous learner
Excellent verbal and written communication skills
Ability to communicate effectively between business and technical teams
Ability to work with and through others to resolve issues
Proven ability to work under tight deadlines and the ability to deal with ambiguity
Experienced oral and written communicator with good interpersonal skills
Positive attitude and solid work ethic
Job Features
Job Category | Developer |
Big Data Developer in New York / New Jersey.
Please refer to the job description below.
Responsibilities
Designing Hive/HCatalog data models, including creating table definitions, file formats, and compression techniques for
structured & semi-structured data processing (see the sketch after this list)
Implementing Spark-based ETL processing frameworks
Implementing Big Data pipelines for data ingestion, storage, processing & consumption
Experienced AWS developer, familiar with AWS services: S3, EC2, EMR / Databricks, Lambda, AWS CI/CD
Enhancing the Talend, Hive/Spark & Unix-based data pipelines
Develop and Deploy Scala/Python based Spark Jobs for ETL processing
Strong SQL & DWH concepts
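As a rough sketch of the Hive/HCatalog table-definition work referenced above, the snippet below issues DDL for a partitioned, Snappy-compressed external Parquet table through Spark SQL. The database, table, columns, and S3 location are placeholder assumptions.

```python
# Hedged sketch: define an external, partitioned Parquet table via Spark SQL with Hive support.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive-ddl-example")
         .enableHiveSupport()
         .getOrCreate())

spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS analytics.clickstream (
        event_id   STRING,
        user_id    STRING,
        event_type STRING,
        payload    STRING
    )
    PARTITIONED BY (event_date DATE)
    STORED AS PARQUET
    LOCATION 's3://example-datalake/clickstream/'
    TBLPROPERTIES ('parquet.compression' = 'SNAPPY')
""")

spark.stop()
```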
Preferred Background
Function as integrator between business needs and technology solutions, helping to create technology solutions
to meet clients’ business needs
Lead project efforts in defining scope, planning, executing, and reporting to stakeholders on strategic initiatives
Understanding of the business's EDW system and creating high-level design documents and low-level
implementation documents
Understanding of the business's Big Data Lake system and creating high-level design documents and low-level
implementation documents
Designing Big data pipeline for Data Ingestion, Storage, Processing & Consumption
Job Features
Job Category | Developer |
Remote East Coast - FTE Only
Duties, Activities, Responsibilities and Competencies:
1. Work closely with the project team members and assist in the full range of project
software activities (requirements definition, design software, develop, deploy and
document).
2. Strong communication skills to effectively communicate with project members, including
project managers, developers, and business-line resources assigned to the project.
3. Focus on developing a REST API on top of a data repository that will be the source of
data for a UI and data retrieval by multiple 3rd parties.
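The role itself calls for C# and .NET Core; purely to illustrate the shape of such an endpoint, here is a minimal API Gateway Lambda-proxy handler sketched in Python. The route, path parameter, and data-access stub are assumptions.

```python
# Hedged sketch of a read-only REST endpoint behind API Gateway's Lambda proxy integration.
# A real implementation for this role would be C#/.NET Core and query Redshift/PostgreSQL.
import json

def fetch_readings(meter_id):
    # Stub standing in for a Redshift/PostgreSQL query.
    return [{"meter_id": meter_id, "reading": 42.0}]

def lambda_handler(event, context):
    meter_id = (event.get("pathParameters") or {}).get("meterId")
    if not meter_id:
        return {"statusCode": 400, "body": json.dumps({"error": "meterId is required"})}
    return {"statusCode": 200, "body": json.dumps(fetch_readings(meter_id))}
```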
Preferred Qualifications:
1. 5+ years of experience developing and implementing REST APIs.
2. Skilled developer with expertise in C# and .Net Core.
3. Experience with Amazon Redshift or PostgreSQL a plus.
4. AWS native technologies such as API Gateway, Lambda, S3, SQS, CloudWatch and Step
Functions.
5. Experience designing, developing, and deploying large-scale customer facing
applications.
6. Experience with Security frameworks related to API security.
7. Experience with API-layer concerns like security, custom analytics, throttling, caching, and logging.
8. Ability to troubleshoot services in complex distributed environments.
9. Experience in writing REST API documentation.
10. Experience with version control tools, such as Git
11. Energy industry a plus.
Job Features
Job Category | Developer |
Remote East Coast - FTE/Contract
This position leads and executes product delivery in partnership with product owners,
architects, and lead engineers as part of a Scrum Team. In this position, you will help product
owners refine business and technical requirements, perform analysis on the system or
capability, test outcomes prior to launch, and partner with operations teams to ensure
supportability of the capability. The technical analyst will analyze, discover, document, and
act as a subject matter expert on the details of key customer facing capabilities, as well as
understand and document the data flow between systems that enable application
functionality. The role will work directly with our various internal and external stakeholders
including business partners, other scrum or IT teams, and external vendors.
Your goals and responsibilities will include:
Understand the business operations in detail; knowledgeable of current and possible
future practices, trends, and information affecting multiple business functions and
aspects of how they relate to other areas.
Support product owners as needed with the refinement of business requirements and
functional design specifications. Identify gaps between business requirements and
application capabilities and recommend options.
Act as a subject matter expert for functional and system capabilities. Analyze,
discover, document, and maintain functional and system designs in Customer
knowledgebase.
Understand and apply platform configurations
Experience working on ETL technologies (Informatica PowerCenter)
Experience with Snowflake and data virtualization (Denodo, etc.)
Responsible for the design and development of mappings, mapplets, sessions, and workflows, and
for scheduling them.
Responsible for handling SCDs in the project.
Responsible for the database design process: logical design, physical design, star schema,
snowflake schema, etc.
Design connection objects and be well versed in session and workflow properties.
3-8 years of progressive experience with Data Warehouse and Informatica
Good programming skills - a quick self-learner with good experience in
Informatica and ETL
Develop solutions in a highly demanding environment with high performance and
availability, and provide hands-on guidance to other team members
Collaborate with product development teams and senior designers to develop
architectural requirements to ensure client satisfaction with Solution.
Assess requirements for completeness and accuracy and decide if they are actionable
for the ETL team.
Conduct impact assessment and determine size of effort based on requirements.
Develop full SDLC project plans to implement ETL solution and resource
identification.
Perform as active, leading role in shaping and enhancing overall ETL Informatica
architecture. Identify, recommend and implement ETL process and architecture
improvements.
Assist and verify design of solution and production of all design phase deliverables.
Identify bugs when developers are not able to do so.
Manage the build phase and quality-assure code to ensure it fulfills requirements and
adheres to the ETL architecture. Resolve difficult design and development issues.
Write SQL queries to query databases as needed for analysis into issues or system
functionality to inform design.
Understand system integration methods and data flow
Actively engage with a scrum team of developers, architects, and analysts in
delivering business requirements, technical requirements and testing for the
implemented functionalities. Work closely with the product owner throughout
delivery to ensure solution meets business needs.
Coordinate test activities with QA engineers, including identifying task
dependencies, test schedules, creation of test data, utilization of test environments,
and performing system testing.
Partner with IT Operations to support resolution of issues as needed and
accept responsibility for the resolution.
As issues occur, provide data analysis and audits to identify and address root cause of
business problems and to make better informed decisions.
Experience with scheduling tools like Tidal
MUST HAVES:
Bachelor’s degree - Math, Science, Computer Science, or Business preferred or
equivalent work experience.
5+ years of experience analyzing data and understanding data ecosystems in a retail or
comparable environment.
SQL and data verification
Experience with data flow validation/ analysis through integrated systems (ETL
Testing)
Excellent analysis/troubleshooting skills, effective partnering/relationship building
skills
Experience managing vendors to deliver components of projects
Understanding of Agile/Scrum frameworks and Waterfall methodology
Should be customer-oriented, mission focused, and a quick, continuous learner
Excellent verbal and written communication skills
Ability to communicate effectively between business and technical teams
Ability to work with and through others to resolve issues
Proven ability to work under tight deadlines and the ability to deal with ambiguity
Experienced oral and written communicator with good interpersonal skills
Positive attitude and solid work ethic
Job Features
Job Category | Developer |
Remote East Coast - FTE/Contract
The Solution Architect (Azure) candidate will be on the Data & Analytics team, working with 11 others.
This architect will be responsible for managing the strategic partner coming in and advising
Campbell on the infrastructure, as well as making sure that the development is appropriate, modern,
and well governed. First, the Solution Architect will design the cloud environment; then the
strategic partner that wins the RFP will come in with accelerators to migrate the data products into
Azure. Our Solution Architect needs to understand what that means, how we manage it, and what we
need to know in order to advise the partners. Currently we are utilizing one of their current
partners before the strategic partner gets in, and we are setting up Azure Data Lake Storage and
Databricks. They will be doing 3 pilots to make sure the architecture works in the environment. This
candidate will be spending 30% of their time on the cloud pilot, making sure everything is moved
properly and there is proper documentation; 25% will be spent upskilling and training the data
engineers and architects on how to set up data products in Azure; 20% will be spent doing
documentation and SOPs; and the last 30% will be spent looking at upcoming projects and
solutioning/strategizing. The ideal candidate needs to have a deep understanding of Azure Cloud as
well as how to develop in ADLS, and have a solution-based mindset.
Need Senior resource - see details above.
KEY MUST HAVES/Experiences as a Solutions Architect:
-15+ years’ experience as a solutions architect
-5-7+ years’ experience with expert level knowledge working as a Solution Architect in Azure cloud
-Background as a data engineer/ developer utilizing Azure ELT / ETL technologies - ADF /
Databricks etc.
-Understanding of Azure/ ADF and Informatica
-Experience working in both Agile and Waterfall methodologies
-Strong communication skills - verbal and written
- Ability to work with and manage partners
Job Features
Job Category | Architect |