Building data pipelines and delivering data with Python

  • kedro
  • pandas, parquet, s3, oracle, db2

API development with Python

  • React / Gatsby
  • c3.js
  • built a library to go from pandas to JSON and into c3
  • API
  • Flask, flask_restful
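
The pandas → JSON → c3 bridge mentioned above probably centered on c3.js's `data.columns` format, where each column is a list whose first element is the series name. A minimal sketch of that core transform (function name is hypothetical; shown with plain dicts, e.g. the output of `DataFrame.to_dict('list')`, so it stays dependency-free):

```python
import json

def to_c3_columns(data):
    """Convert a mapping of column-name -> values (like DataFrame.to_dict('list'))
    into the c3.js `data.columns` shape: [[name, v1, v2, ...], ...]."""
    return {"columns": [[name, *values] for name, values in data.items()]}

# Example payload, ready to hand to c3.generate({data: ...}) on the front end:
payload = to_c3_columns({"sales": [120, 95, 143], "returns": [4, 7, 2]})
print(json.dumps(payload))
```

In a Flask / flask_restful resource, the return value serializes directly to the JSON body the chart consumes.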

SQL skills

  • pea

Cloud knowledge (preferably AWS + Snowflake)

  • batch
  • snowflake


  • New 3600 rods being released just before capital ordered for reman

  • mainframe 443

    • security
    • fighting for the correct solution
    • rug gets pulled out
  • core forecast was unusably slow and not useful for data exploration

    • built a quick Gatsby site with a Python/hug backend
    • each part went from about 30 s render time to under 100 ms
    • served its purpose for what it was needed; Tableau was still used for the full report
  • quickly reduce back orders on parts

    • Set up an email alerting system recommending when customers have reman available but are waiting for a new part on back order
    • get customers their parts faster
    • prototype up in under 24 hrs
  • Reduce CVEs on ~60 pipelines

  • update and maintain ~60 pipelines

  • Markata

    • created my own web framework to learn pluggy and diskcache
    • it has taught me a lot about performance, reusability, and library development
  • Whitehat Warranty

    • big dataset
    • wimpy server to run it on
    • set up a download tool
    • set up a rules engine to detect fraud
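
The notes above mention cutting per-part render time from ~30 s to under 100 ms. The notes don't say how, but a common way to get there is caching the expensive per-part query so repeat renders skip it; a stdlib-only sketch with hypothetical names:

```python
from functools import lru_cache

def _run_forecast_query(part_number: str) -> dict:
    """Stand-in for the slow (~30 s) forecast query; a placeholder here."""
    return {"part": part_number, "forecast": [10, 12, 9]}

@lru_cache(maxsize=4096)
def get_forecast(part_number: str) -> tuple:
    """Cache per-part results so only the first render pays the query cost.
    Returns a tuple (immutable) so cached values can't be mutated by callers."""
    result = _run_forecast_query(part_number)
    return (result["part"], tuple(result["forecast"]))

get_forecast("3600-rod")  # first call runs the query
get_forecast("3600-rod")  # subsequent calls are served from the cache
```

An in-process cache like this works when the data refreshes on a known schedule; a shared cache (e.g. diskcache, which appears elsewhere in these notes) would be the multi-process equivalent.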


Why Netflix

I use it every day and admire the product. The engineers that I have interacted with online have been nothing short of incredible with lots of knowledge to share.

This role is my dream next role. What I am doing right now feels very in line with what this req is for. I have a background in ME -> Data Science with very little COE support, so I know the pains that this can bring.


  • Remote
  • Travel
  • What do the teams I interface with do?
  • What is the current status of the teams I interface with?
  • Big initiative on climate: coming from a reman role, what do you do for decommissioned hardware, or to prevent e-waste?


SEAL software engineering accelerate and lifting

  • smart pay, simplify payments for small businesses

  • invoices

  • integrated payable

  • create a report

  • automatically recommend best payment method

  • released mvp last august with one customer

  • scale to 100k, $1B in transactions

  • Quann's team is responsible for payment decisions

  • monthly movement team

  • using CCP (commercial card platform) with many millions

  • AWS platform from scratch

  • focus on serverless

  • docker and fargate ok

  • realtime

  • Gherkin BDD

  • pytest
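
Given the pytest and Gherkin/BDD bullets above, a prep sketch of how a payment-method recommendation might be unit-tested — the rule below is hypothetical, not SEAL's actual decision logic:

```python
def recommend_payment_method(amount: float, card_limit: float = 10_000.0) -> str:
    """Hypothetical rule: prefer the commercial card up to a limit,
    otherwise fall back to ACH."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    return "card" if amount <= card_limit else "ach"

# pytest-style tests: plain functions with bare asserts, auto-discovered
# by pytest (they also run fine when called directly).
def test_small_amount_uses_card():
    assert recommend_payment_method(500) == "card"

def test_large_amount_falls_back_to_ach():
    assert recommend_payment_method(50_000) == "ach"

test_small_amount_uses_card()
test_large_amount_falls_back_to_ach()
```

With pytest-bdd, the same asserts would sit behind `@given`/`@when`/`@then` steps bound to a Gherkin feature file.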

John with insight

Cooper works directly with hiring managers. Pratt & Whitney, 90 minutes

1-1:45 pm central - resume George, Rafal

  • heavy use of Django
  • react

3:15 pm

  • live code the thing

  • 30 minute on Tuesday with Scott Michael the director

Keten Vyas with Integrated Resources

Yes, that's correct. The pay rate for this role would be between $60/hr. - $80/hr. on W2 without any benefits.

Job Title: Senior Data Engineer
Job Location: 100% Remote Role
Job Duration: 3+ months contract on W2 (high possibility of extension)

Job Description: Assigned to projects under the management of the client Office of Ambulatory Care and Population Health, and under the direction of the relevant program leads for each discrete initiative, the Senior Data Engineer will organize, analyze, and communicate data as well as build technology solutions, in order to support and inform programmatic and operational efforts of Client programming. May be assigned to programs such as Humanitarian Emergency Respite Center (HERC) facilities, which offer direct service provision, resource navigation, and temporary shelter to single adults, adult families, and families with children. Multiple roles exist within this title, and individual staff will be assigned to the best fitting role based on experience. Specific responsibilities may include:

Summary of Duties and Responsibilities:

  • Building and maintaining the internal data infrastructure to be used by data analytics staff and developing and ensuring standardization in data cleaning, programming, and analyses across large, complex data sets
  • Structuring and transforming datasets for multiple complex use-cases, user stories and related requirements within assigned projects/programs
  • Building back-end data migration tools and infrastructure to support data analytics work
  • Performing in-depth investigation and analysis to identify and resolve complex processing problems associated with the systems, programs, and datasets utilized by the projects/programs
  • Recommending programs, practices, and standards to facilitate uniform application of electronic data methods, code versioning and review, ticket management
  • If applicable, supervising and overseeing additional data engineering staff, providing assistance and insights on complex and difficult data sets and ensuring accuracy of work products
  • Providing advice to front-end developers and field staff on overall data intake and integrations


  • Knowledge of Tableau preferred
  • Expertise in Python or equivalent programming language for automation preferred
  • Advanced knowledge of SQL preferred
  • Experience with API interfacing in a data engineering and analytics environment preferred

Preferred Skills:

  • Ability to work autonomously, think analytically, and anticipate data issues to solve before they arise
  • Excellent written and verbal communication skills, with the ability to explain data systems to non-technical teams
  • Strong quality control abilities and exceptional attention to detail
  • Ability to manage multiple complex projects at a time, prioritize, and execute on tight timelines

Kim Kilgoar with Tier4 group


Are you a Lead Data Analyst professional who excels in designing, implementing, and managing mature enterprise level data management capabilities and technologies?

Serving as a Lead Data Analyst you are responsible for translating the high-level solution design into application software detailed design artifacts and ensuring they are maintained and reviewed throughout the project life cycle, while ensuring the detailed design and implementation of the application meets the functional and non-functional requirements. You are also responsible for ensuring the appropriate application components are installed and configured to meet the requirements for all environments (development, test, stage, and production) and Source Code management. You will work directly with the following data management capabilities and technologies relating to the following disciplines: metadata management, data governance, master data management, data quality, data access and authorization.

This is a 1+ year contract opportunity that is fully remote with competitive compensation. This opportunity also comes with health/vision/dental benefits, a flexible spending account, life insurance, 401k, and more. You will also have the opportunity to grow and work side by side with one of the best Fortune 100 companies in the nation!


  • Bachelor’s degree in a related field
  • 6+ years experience with data management
  • 4+ years experience with: DB2, platform testing, data needs, SQL

  • Experience with Node.js or JavaScript preferred (not required, but desire to learn is!)
  • Experience architecting and designing data management solutions to meet enterprise-level goals and objectives
  • Experience with Agile and DevOps environments
  • Experience with industry-standard data cataloging, data quality, and/or data modeling technologies

If you are seeking an opportunity to advance your career as a Data Management Lead with an innovative, diverse, community-driven, growing Fortune 100 company, please apply for immediate consideration.


  • all APIs are in Python (the majority)
  • infrastructure to be built using TypeScript
  • Agile workflow
  • Lambda
  • API Gateway
  • DynamoDB
  • AWS (4): 1 lead from Tek and 6 developers
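
For the Lambda + API Gateway stack above, a minimal sketch of a Python handler for an API Gateway proxy integration — route and field names here are made up; a real handler would fetch the item from DynamoDB via boto3:

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy integration.
    Reads a path parameter and returns the standard proxy response shape:
    statusCode, headers, and a JSON-encoded string body."""
    item_id = (event.get("pathParameters") or {}).get("id", "unknown")
    body = {"id": item_id, "source": "sketch"}  # placeholder payload
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

# Simulate an API Gateway event locally:
resp = handler({"pathParameters": {"id": "42"}}, None)
print(resp["statusCode"], resp["body"])
```

The same handler shape deploys unchanged whether packaged as a zip or a Docker image on Fargate-style runtimes, which matches the "docker and fargate ok" note above.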

SRE Centene

Garrick Smigelski

  • fortune 25
  • #2 healthcare

Clayton Baltamore


Dice - LinkedIn Snowflake flask

Senior site reliability engineer - snow

Centene corp

Fortune 24, Medicare/Medicaid, Garrick

Dan Winn -


  • Seth - configuration backend - FastAPI
  • Alex - front-end
  • Shannon - from the beginning - services/infrastructure - microservices - event-centric
  • JoAnn - Product owner, 1 yr, works closely with sales and implementation