Job Archives
Job Title: Senior Acquisition Analyst
Location: Springfield, VA
Clearance: Candidate must already possess an active TS/SCI
The Acquisition Analyst will manage assigned program and/or contract procurement efforts, including assisting PMOs in preparing for acquisition and/or procurement activities, defining and developing necessary acquisition documentation, identifying and managing program and contract schedules, identifying program and/or procurement issues, and reporting readiness status.
Job Functions:
• Plan and manage assigned Acquisition Governance in support of the Component Acquisition
Executive, the Senior Procurement Executive, and the Executive Secretariat.
• Perform quality reviews of the completeness and accuracy of Acquisition Governance process
documentation, and provide guidance and feedback to address weaknesses and gaps.
• Review and assess current acquisition governance processes and routinely suggest innovative,
more efficient methods to improve the GOVERNMENT's acquisition processes.
• Document, monitor, manage, and maintain all acquisition documentation, the Acquisition
Documentation list, acquisition policies and instructions, SOPs, process flows, training materials,
and all required historical documentation.
• Train and assist Government acquisition teams through the acquisition governance process,
including documenting meeting minutes, tracking action items, and serving as a technical expert
to ensure GOVERNMENT acquisitions align with GOVERNMENT mission objectives.
• Collect, document and provide feedback on lessons learned throughout the acquisition process
and implement an effective process to report on metrics.
• Maintain Acquisition Governance calendars.
• Coordinate with PM, OCS, and DCQ, and publish a 3-month calendar and updated 24-month
ACB and CRB look-aheads weekly.
• Perform logistics for working group and board meetings including scheduling, calendar
management and preparing the conference rooms for meetings.
• Prepare and deliver read-ahead books for QPR, ACQWG, ACB, SCRM SSC, SCRM WG, and
CRB executives.
• Document and record all events, including notes, minutes, and actions, for QPR, SCRM SSC,
SCRM WG, ACQWG, ACB, and CRB.
• Create and maintain Acquisition Governance new-employee orientation books and reference
books.
• Maintain the CAE and Acquisition Governance databases, workbooks, SharePoint sites,
and websites for correctness and currency of data.
• Lead data and information gathering to capture and document acquisition processes.
• Gather research and provide recommendations, process improvement opportunities, and
planning to support the Component Acquisition Executive's (CAE) Acquisition Workforce
(AWF) Team.
• Lead project activities as required.
• Utilize existing acquisition expertise and coordinate with subject matter experts to capture and
develop organizational process assets in a variety of content formats (MS Word documents,
PowerPoint presentations, etc.), as directed by the government.
• Integrate materials into the GOVERNMENT Learning Management System, following
workflows defined by the government.
• Document, monitor, manage and maintain all assigned acquisition documentation, workflows,
acquisition policies and instructions, SOPs, training materials and all required historical
documentation and official record keeping.
Qualifications:
Bachelor’s degree or higher in Business Management or equivalent work experience
5 years of experience developing acquisition and procurement documentation such as
Acquisition Strategy, Acquisition Plan, RFPs, Source Selection Plans or other necessary
acquisition documentation
2 years of experience executing government acquisitions and/or acquisition-specific
support involving the preparation of, or response to, acquisition and procurement
elements such as acquisition strategies, acquisition plans, Requests for Information /
Proposals (RFI/RFP), source selection plans, and other elements of documentation used
throughout the acquisition life cycle
Certification in program management and/or contracting, or equivalent (e.g., DAWIA
Program Management certification, Level I or II)
Job Features
Job Category: Acquisition Analyst
Government client.
Must Have:
Bachelor’s degree in relevant field of study. MBA preferred.
8 years of relevant experience in Accounting or Auditing.
Knowledge of Federal Government and DOE accounting policies and procedures.
Knowledge of Federal Acquisition Regulations (FAR) and the DOE FAR Supplement
(DEAR).
Proficiency with MS Office tools.
Excellent verbal and written communications skills.
U.S. citizenship.
Job Duties:
Complete post-payment invoice reviews consistent with Finance procedures and processes.
Conduct independent financial and labor related assessments that focus on compliance
requirements including planning, scheduling, and coordination of reviews.
Conduct contractor labor floor checks to test the reliability of employee time records,
verifying that employees are actually at work, that they are performing in their assigned job
classifications, and that time is charged to the proper cost objective.
Complete Internal Control reviews and report them in the assessment quality and format
provided by Finance.
Present independent reviews, assessment plans, and audit results to the Finance Division.
Ensure review plans and activities meet established milestones.
Perform unannounced contractor labor floor checks as part of a team-based effort.
Job Title: .NET Developer
Location: Washington, DC / Remote
Clearance: US Citizen / Permanent Resident
We are seeking a Senior .NET Developer with 8+ years of experience.
Creates and maintains C# code using MS tools such as Visual Studio, SQL Server Management
Studio, Visual SourceSafe, and Subversion
Leads stakeholder meetings to resolve technical issues and design conflicts
Leads team strategy meetings to resolve environmental technical constraints and policy
decisions
Supports and guides junior developers as needed
Qualifications:
4-year degree in Computer Science, Engineering, Mathematics, Business, or related
discipline from an accredited university
Technical certifications are a plus
8+ years of experience developing enterprise-class web applications in the public or
private sector
Skills
Work experience with the following technologies
o .NET C# – Expert skills required
o HTML 4.0 – Expert skills required
o CSS – Expert skills required
o JavaScript – Strong skills required
o MS SQL Server – Strong skills required
o VB.NET – Strong skills preferred
Work experience with the following tools
o Visual Studio 2010 – 2017 – Experience Required
o MS SQL Server Studio 2008 – Experience Required
o IIS 6.0 – Intermediate skills desired
o Working knowledge of MCSD best practices
Strong understanding of software development using Scrum and similar processes
Strong understanding of BDD and TDD
Expert understanding of the data, business, and presentation layers in web
development
Expert understanding of web architecture
Demonstrated competence avoiding security issues related to web applications
Demonstrated competence with software design patterns and how to leverage them
Working knowledge of Section 508 / WCAG accessibility standards
Experience working with and mentoring junior developers within an Agile environment
Experience developing enterprise business applications
Strong work ethic with a relaxed attitude
Job Features
Job Category: Developer
Onsite - Location: NY/LA
Soft Skill:
• Discuss requirements with the client and communicate them to the project team
• Coordinate with stakeholders such as support teams, infrastructure teams, and business users
• Create project delivery and execution plans to meet client requirements
• Create and review system architecture
• Suggest solution, design, and implementation approaches
• Assist in creating the resource plan, allocating tasks to balance responsibilities and meet
delivery timelines smoothly
• Run weekly governance reviews covering project health, delivery challenges, risks, and
mitigation plans
• Assist users with UAT against the requirements to ensure the final product meets
expectations (if required)
• Assist in production deployments
• Team management
• Team grooming as required
Tech Skills:
• Bachelor of Engineering in Computer Science, Software Engineering, or a related field,
and a minimum of 8 to 10 years of software development experience
• Good technical expertise in data warehousing and analytics technologies, including data
lakes, ETL/ELT, data modeling, and data visualization using AWS
• Experience in designing AWS cloud architecture and
operationalizing solutions around monitoring, alerting, logging
etc.
• In-depth knowledge of key cloud services for data integration, BI, and processing,
including but not limited to AWS EMR, AWS Data Pipeline, AWS Glue, AWS CloudSearch,
Athena, AWS Lambda, and AWS Kinesis
• Good knowledge of Snowflake and its services
• Experience with basic Python
• Experience of enabling DevOps automation for AWS with
appropriate security and privacy considerations
• Expertise in developing ETL workflows comprising complex
transformations like SCD, deduplications, aggregations, etc.
• Strong SQL background
• Experience with workflow/ETL tools like Airflow will be an added advantage
• Good written and oral communication
• Strong analytical skills
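The ETL expertise requested above names Slowly Changing Dimension (SCD) handling among the expected transformations. As a rough, hedged illustration only (the function, field names, and dict-based "dimension table" are invented for this sketch and are not part of the posting), an SCD Type 2 merge closes out changed rows and appends a new current version:

```python
from datetime import date

def scd2_merge(dimension, incoming, today=None):
    """Minimal SCD Type 2 merge over a list-of-dicts 'dimension table'.

    Each dimension row has keys: key, attrs, valid_from, valid_to, current.
    Changed keys get their current row expired and a new current row added;
    brand-new keys are inserted; unchanged keys are left alone.
    Hypothetical schema, for illustration only.
    """
    today = today or date.today()
    # Index the currently-active row for each business key.
    by_key = {row["key"]: row for row in dimension if row["current"]}
    for rec in incoming:
        cur = by_key.get(rec["key"])
        if cur is None:
            # Brand-new key: insert as the current version.
            dimension.append({"key": rec["key"], "attrs": rec["attrs"],
                              "valid_from": today, "valid_to": None,
                              "current": True})
        elif cur["attrs"] != rec["attrs"]:
            # Changed attributes: expire the old version, append a new one.
            cur["valid_to"] = today
            cur["current"] = False
            dimension.append({"key": rec["key"], "attrs": rec["attrs"],
                              "valid_from": today, "valid_to": None,
                              "current": True})
    return dimension
```

In a real AWS/Snowflake pipeline this logic would typically live in a `MERGE` statement or a Glue/Spark job rather than plain Python; the sketch only shows the expire-and-append pattern itself.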
Job Features
Job Category: Project Manager
Work with large data sets and libraries of the Hadoop ecosystem
such as Spark, HDFS, YARN, Hive, Impala, and Oozie (7+ years).
Functional and/or object-oriented programming in Python (Java and Scala preferred)
Multi-threaded applications; Concurrency, Parallelism, Locking
Strategies and Merging Datasets
Solid understanding of SQL, Relational and NoSQL databases
Solid understanding (3+ years) in creating and consuming RESTful services
Solid understanding (5+ years) in Memory Management, Garbage Collection & Performance
Tuning
(5+ years) of experience and working knowledge of distributed/cluster computing concepts
Solid (5+ years) experience in Linux environments; strong knowledge of shell
scripting and file systems
Knowledge of CI tools and build tools like Git, Maven, SBT, Jenkins, and Artifactory/Nexus
Self-managed and results-oriented with a sense of ownership
Excellent analytical, debugging, and problem-solving skills
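The requirements above call out multi-threaded applications, locking strategies, and merging datasets. As an illustrative sketch only (function and variable names are invented, not from the posting), a coarse-grained locking strategy for merging per-partition results from worker threads looks like:

```python
import threading

def merge_concurrently(partitions):
    """Merge per-partition dicts into one shared result dict.

    Each worker thread prepares its data outside the lock, then takes a
    short critical section to merge into the shared target. Illustrative
    sketch of a coarse-grained locking strategy, not production code.
    """
    merged = {}
    lock = threading.Lock()

    def worker(part):
        local = dict(part)          # heavy work happens outside the lock
        with lock:                  # keep the critical section short
            merged.update(local)

    threads = [threading.Thread(target=worker, args=(p,)) for p in partitions]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return merged
```

Finer-grained alternatives (per-key locks, lock-free queues, or merging in the main thread after `join`) trade contention against complexity; the choice is exactly the "locking strategies" judgment the posting asks about.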
Job Features
Job Category: Developer
Location: NY/NJ preferred (currently remote).
Required:
Sound knowledge of Snowflake architecture and its various utilities
Snowflake user administration activities, such as user creation and access management
Optimization where applicable
Coordination with offshore teams
Preferred Background
Bachelor of Engineering in Computer Science, Software Engineering, or a related field, and a minimum of 4 to 8 years of software development experience
Strong hands-on data modeling experience (at least 3 years)
Strong hands-on experience with ERwin
Strong SQL background
Good written/oral communication and analytical skills
Job Features
Job Category: Architect
Currently remote; expected to be onsite in PA after COVID
Strong exposure to Data Warehousing Concepts, Practices, Procedures and Performance optimization techniques
Strong Data profiling, Data Analysis and Data modeling skills
Strong troubleshooting skills and experience in leading operational activities
Responsible for designing DataStage solutions using parameters, variables, procedures, and pre/post session commands/shell scripts with industry best practices
Should have served in Team Lead and Data Architect roles
Required:
Solid understanding (5+ years) in Scala, Java, JSON, XML
(5+ years) Experience working with large data sets and pipelines using tools and libraries
of Hadoop ecosystem such as Spark, HDFS, YARN, Hive and Oozie
(5+ years) of experience and working knowledge of distributed/cluster computing concepts.
Solid understanding of SQL, relational and NoSQL databases
Solid understanding (5+ years) in multi-threaded applications: concurrency, parallelism,
locking strategies, and merging datasets
MUST: Solid understanding (5+ years) in memory management, garbage collection, and
performance tuning
(5+ years) experience in Linux environments; strong knowledge of shell scripting and
file systems
Experience working in a cloud-based environment like AWS
Knowledge of CI tools like Git, Maven, SBT, Jenkins, and Artifactory/Nexus
Self-managed and results-oriented with a sense of ownership
Excellent analytical, debugging, and problem-solving skills
Experience with Agile/Scrum development methodologies a plus
Minimum Bachelor's degree in CS or equivalent with 8-10 years of industry experience
Must have skills:
Experience: 13-15 years.
• B.S., M.S. or Ph.D. degree in computer science or a related field or
equivalent work experience.
• 3+ Years of solid professional coding experience writing production
quality code, preferably Scala (Spark) & Python (PySpark).
• In-depth knowledge of distributed systems, MapReduce, Hive, Tez, Spark
and Kafka internals.
• 2+ years of experience working on complex distributed systems, or
any data processing and data management systems leveraging Spark, Flink
and Kafka.
• Experience working with public cloud platforms, preferably AWS.
• Bonus points if experience working with EMR, Databricks and Snowflake.
Nice to have skills:
• Working knowledge of open-source ML frameworks and end-to-end model
development life cycle.
• Previous working experience running containers (Docker/LXC) in an
environment using one of the container orchestration services
(Kubernetes, Docker Swarm, AWS ECS, AWS EKS).
Location:
There are multiple positions with these options:
- Remote
- Hybrid: Onsite few days in NJ/NY (Local to NY/NJ preferred)
Sr. Hadoop Admin
4-5+ years managing and supporting large-scale production Hadoop
environments (configuration management, monitoring, and performance tuning)
Experience with any of the Hadoop distributions (Apache, Hortonworks, Cloudera, MapR,
IBM BigInsights, Pivotal HD) is a requirement.
Experience should include Hadoop cluster architectural design, installation, configuration,
proactive monitoring, and performance tuning of any of the Hadoop distributions.
Experience in Hadoop software installation, upgrades, system maintenance, application
development troubleshooting, Linux OS and Hadoop (Apache Ranger) security design and
implementation, data management, Hadoop configuration, process scheduling, and job
execution setup and troubleshooting.
4-6+ years of experience performing administrative activities like managing HBase snapshots,
implementing hotfixes, high availability feature implementations (e.g. rack awareness, HMS HA,
etc.), and Backup, Archival and Recovery (BAR).
Must be proficient in Hadoop storage design, capacity planning, optimization, and
implementation.
4-6+ years of experience in the implementation and ongoing management of proactive
system/application monitoring solutions for any Hadoop distribution.
Strong experience in Scripting Languages (Bash, Spark, SQL, Python). Must be proficient in
Unix/Linux shell scripting.
Hands-on experience in Linux administrative activities on RHEL/CentOS.
Proficiency in setting up and integrating NiFi and Hadoop ecosystem tools (HBase, Hive
internals including HCatalog, Pig, Sqoop, Spark, YARN, ZooKeeper, and Flume/Kafka) is a
must.
Strong hands-on experience in configuring and troubleshooting Kerberos (KMS/KTS).
Automation experience in CI/CD (Continuous Integration/Deployment) with Jenkins, Ansible,
Terraform, Puppet, or Chef.
Implementing standards and best practices to manage and support data platforms as per
distribution.
Plan for and support hardware and software installation and upgrades.
Experience in MySQL & PostgreSQL databases.
Team player able to work under minimal direction
Preferred:
Experience with Hortonworks HDP and Cloudera CDP Hadoop distributions a plus.
Experience with DR (Disaster Recovery) strategies and principles.
Development or administration experience on NoSQL technologies like HBase, MongoDB,
Cassandra, Accumulo, etc.
Microsoft SQL Server database experience a plus.
Development or administration on cloud platforms like Azure Databricks, Amazon Redshift, etc.
Development/scripting experience on configuration management and provisioning tools (e.g.
Puppet, Chef, etc.)
Web/Application Server & SOA administration (Tomcat, JBoss, etc.)
Development, implementation, or deployment experience on the Hadoop ecosystem (HDFS,
MapReduce, Hive, HBase)
Experience on any one of the following will be an added advantage:
Hadoop integration with large scale distributed data platforms like Teradata, Teradata
Aster, Vertica, Greenplum, Netezza, DB2, Oracle, etc.
Proficiency with at least one of the following: Java, Python, Perl, Spark or Web-related
development
Knowledge of Business Intelligence and/or Data Integration (ETL) operations delivery
techniques, processes, methodologies
Exposure to data acquisition, transformation, and integration tools like Talend and
Informatica, and BI tools like Tableau and Pentaho
Job Features
Job Category: Developer
Domo Developer. Hands-on experience.
BI reporting tools (including Domo)
BI: KPIs, scorecards, dashboards
General ETL and Data handling experience
Advanced SQL
Basic Shell and Python scripting
Media and Marketing domain experience preferred
Ability to create dashboards which include quantitative and qualitative analysis of data
Team player
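"Advanced SQL" for dashboards of the kind described above usually means window functions and aggregation. A hedged, self-contained sketch using Python's built-in sqlite3 (the table, columns, and sample numbers are invented for illustration; a Domo deployment would run equivalent SQL against its own data warehouse):

```python
import sqlite3

# Invented sample data: daily spend per marketing channel.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE spend (channel TEXT, day INTEGER, amount REAL);
    INSERT INTO spend VALUES
        ('search', 1, 100), ('search', 2, 150), ('search', 3, 50),
        ('social', 1, 80),  ('social', 2, 120);
""")

# Running total per channel via a window function -- the kind of
# query behind a cumulative-spend KPI card.
rows = conn.execute("""
    SELECT channel, day, amount,
           SUM(amount) OVER (PARTITION BY channel ORDER BY day)
               AS running_total
    FROM spend
    ORDER BY channel, day
""").fetchall()

for row in rows:
    print(row)   # e.g. ('search', 1, 100.0, 100.0)
```

Note that window functions require SQLite 3.25+ (bundled with recent Python releases); on an older runtime the same rollup would need a correlated subquery.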
Job Features
Job Category: Developer
Bachelor's degree in Math, Science, Computer Science, or Business preferred, or
equivalent work experience.
5+ years of experience analyzing data and understanding data ecosystems in a retail or
comparable environment.
SQL and data verification
Minimum 1 year of designing and implementing a fully operational, production-grade,
large-scale data solution on Snowflake Data Warehouse.
Snowpipe experience, ETL experience, and data migration to Snowflake
Hands-on experience designing and implementing production-grade data warehousing
solutions on large-scale data technologies such as Teradata, Oracle, or DB2
Expertise in and excellent understanding of Snowflake internals and the integration of
Snowflake with other data processing and reporting systems
Understanding of Agile/Scrum frameworks and Waterfall methodology
Should be customer-oriented, mission focused, and a quick, continuous learner
Excellent verbal and written communication skills
Ability to communicate effectively between business and technical teams
Ability to work with and through others to resolve issues
Proven ability to work under tight deadlines and the ability to deal with ambiguity
Experienced oral and written communicator with good interpersonal skills
Positive attitude and solid work ethic
Job Features
Job Category: Developer
Junior Java developer with Operations Support experience:
Core Java, Collections, Multi-threading as core skillset with knowledge of databases and design patterns, and some hands-on experience on UNIX.
Hands on coding experience.
The candidate needs to be a self-starter, a confident personality with good communication skills who can work with minimum oversight and handholding.
Education (preferred): Bachelor's degree
Experience (preferred): REST, 1 year
Job Features
Job Category: Developer
Work with large data sets and libraries of the Hadoop ecosystem
such as Spark, HDFS, YARN, Hive, Impala, and Oozie (7+ years).
Functional and/or object-oriented programming in Java and Scala
Multi-threaded applications; Concurrency, Parallelism, Locking
Strategies and Merging Datasets
Solid understanding of SQL, Relational and NoSQL databases
Solid understanding (3+ years) in creating and consuming RESTful services
Solid understanding (5+ years) in Memory Management, Garbage Collection & Performance
Tuning
(5+ years) of experience and working knowledge of distributed/cluster computing concepts
Solid (5+ years) experience in Linux environments; strong knowledge of shell
scripting and file systems
Knowledge of CI tools and build tools like Git, Maven, SBT, Jenkins, and Artifactory/Nexus
Self-managed and results-oriented with a sense of ownership
Excellent analytical, debugging, and problem-solving skills