Brian Newman

My Python Data Stack in 2026

The stack I keep coming back to for data engineering, APIs, and full-stack work, and why I still like boring, dependable tools.

  • python
  • fastapi
  • databricks
  • dbt
  • prefect
  • react

I spend a lot of time thinking about tools, but I do not really want my stack to be exciting. I want it to be dependable.

That is probably the easiest way to describe the stack I keep coming back to. I like tools that are practical, composable, and easy to reason about. I want strong foundations, good typing, clear deployment patterns, and enough flexibility to build internal platforms, APIs, automation, and data workflows without fighting the framework every day.

Right now, my core stack looks like this:

  • Python
  • FastAPI
  • React
  • Databricks
  • dbt
  • Prefect
  • PostgreSQL
  • Tailwind CSS

That mix strikes a good balance between application development, data engineering, and operational tooling.

Python is still the center of gravity

Python is still the language I reach for first.

It is not because Python is perfect. It is because Python lets me move across domains without changing how I think. I can build APIs, orchestrate jobs, clean data, write automation, call external systems, and stand up prototypes quickly without context switching into a completely different ecosystem.

That matters more than people sometimes admit.

A lot of teams end up with one language for data work, another for backend systems, another for scripts, and a fourth for one-off integrations. It can work, but it also creates more seams, more handoffs, and more places for quality to drift. Python gives me one strong default for a lot of those jobs.

I also like the maturity of the ecosystem. Between typing, Pydantic, modern ASGI tooling, and strong libraries around HTTP, data processing, and automation, Python feels far more complete for production work than it often gets credit for.
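As a small illustration of how far typing in the standard library has come, here is a typed, validated config record built with nothing but a dataclass. The `JobConfig` shape is hypothetical, and Pydantic would generate this kind of validation automatically; this stdlib-only sketch just shows the idea by hand.

```python
from dataclasses import dataclass, field

# Hypothetical config record; the names are illustrative, not from any
# real project. Pydantic automates this validation; here it is by hand.
@dataclass(frozen=True)
class JobConfig:
    name: str
    retries: int = 3
    tags: list[str] = field(default_factory=list)

    def __post_init__(self) -> None:
        # Enforce the invariants the type hints alone cannot express.
        if not self.name:
            raise ValueError("name must be non-empty")
        if self.retries < 0:
            raise ValueError("retries must be >= 0")

cfg = JobConfig(name="nightly-load", tags=["etl"])
```

The type hints document the shape for readers and tools; the `__post_init__` check is the part a validation library would hand you for free.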

FastAPI fits the way I like to build

FastAPI has been a great fit because it is straightforward.

I like that the request and response models can be explicit. I like that validation is built into the shape of the application. I like that it does not feel overly magical. Most of all, I like that it works equally well for clean APIs, internal tools, and systems that need to grow over time.

For me, FastAPI sits in a sweet spot:

  • fast enough
  • typed enough
  • modern enough
  • simple enough

That is a hard balance to hit.

When I am building internal platforms or workflow systems, I want the backend to be the source of truth. I want the backend to define what actions are allowed, what state a record is in, what fields are editable, and what the next step is. FastAPI supports that style really well.
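That backend-as-source-of-truth style can be sketched framework-free; in a FastAPI app, a table like the one below would sit behind typed route handlers so the API, not the UI, decides what a record may do next. The state names and transitions here are hypothetical.

```python
# Hypothetical workflow states and the transitions the backend allows.
# Route handlers would consult this table before mutating a record.
TRANSITIONS: dict[str, set[str]] = {
    "draft": {"submitted"},
    "submitted": {"approved", "rejected"},
    "approved": set(),
    "rejected": {"draft"},
}

def next_states(state: str) -> set[str]:
    """What the backend advertises as the legal next steps."""
    return TRANSITIONS.get(state, set())

def transition(state: str, target: str) -> str:
    """Apply a transition, refusing anything the table does not allow."""
    if target not in next_states(state):
        raise ValueError(f"cannot move from {state!r} to {target!r}")
    return target
```

Because the table lives in one place on the backend, the frontend can render buttons straight from `next_states()` without duplicating the rules.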

React is still the right frontend when the UI matters

On the frontend side, I still like React.

There are lighter options, and I think that is healthy. Not every interface needs a big frontend framework. But when I want something polished, component-driven, and easy to evolve, React is still the easiest choice for me.

I especially like using React when:

  • the UI is stateful
  • the app will grow over time
  • roles and permissions matter
  • reusable components matter
  • layout and interaction quality actually matter

For more content-first sites, I am happy to simplify and lean into server rendering. But for applications, dashboards, and more complex workflow interfaces, React still makes a lot of sense.

Databricks, dbt, and Prefect work well together

On the data side, I tend to think in layers.

I want ingestion to be reliable. I want transformation logic to be explicit. I want orchestration to be understandable. I want monitoring and retries to exist. I want the data platform to feel like a system, not a collection of notebooks and good intentions.

That is where Databricks, dbt, and Prefect fit really well together.

Databricks

Databricks gives a strong execution environment for heavier data workloads and a solid home for lakehouse-style patterns.

dbt

dbt gives structure to transformation logic. That matters a lot. Once a team starts standardizing naming, tests, lineage, and model definitions, everything gets easier to reason about.
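A sketch of what that standardization looks like in practice: a dbt schema file that attaches documentation and tests to a model, which `dbt test` then enforces. The model and column names below are hypothetical.

```yaml
# models/schema.yml -- hypothetical model; dbt enforces these with `dbt test`
version: 2

models:
  - name: stg_orders
    description: "One row per order, cleaned from the raw source."
    columns:
      - name: order_id
        description: "Primary key."
        tests:
          - unique
          - not_null
      - name: status
        tests:
          - accepted_values:
              values: ["placed", "shipped", "returned"]
```

Once every model carries a file like this, naming, tests, and lineage stop being tribal knowledge and become part of the repo.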

Prefect

Prefect is the layer I like for orchestration because it lets me treat workflows like software. I can define retries, state transitions, parameters, logging, and deployment patterns in code. That is a much better fit for the way I think than a pile of disconnected scheduled jobs.
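Prefect's real decorators (`@task(retries=...)`, `@flow`) express this directly. As a dependency-free sketch of the same workflows-as-software idea, here is a minimal retry decorator; this is not Prefect's API, just the shape of it.

```python
import functools
import time

def task(retries: int = 0, retry_delay_seconds: float = 0.0):
    """Toy stand-in for the retries-in-code pattern; not Prefect's API."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(retries + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == retries:
                        raise  # out of retries: surface the failure
                    time.sleep(retry_delay_seconds)
        return wrapper
    return decorator

calls = {"n": 0}

@task(retries=2)
def flaky_extract() -> str:
    # Fails twice, then succeeds -- exercising the retry loop above.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"
```

The point is that retry policy is ordinary, reviewable code sitting next to the work it protects, rather than configuration buried in a scheduler UI.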

PostgreSQL stays in the picture more often than people expect

I am also still a big believer in PostgreSQL.

A lot of systems do not need something exotic on day one. They need something durable, understandable, and flexible. PostgreSQL keeps showing up because it solves a lot of real problems without turning into a science project.

It works well for application data, metadata, queues, content, and smaller platform workloads. It is also one of those tools that gets even more useful the better you understand it.
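On the queue point specifically: the `FOR UPDATE SKIP LOCKED` pattern is what lets plain PostgreSQL act as a job queue without anything exotic. The `jobs` table and its columns below are hypothetical, and the query is shown as a string since it needs a live database (via a driver such as psycopg) to actually run.

```python
# Hypothetical table: jobs(id, payload, status). With FOR UPDATE SKIP
# LOCKED, concurrent workers each claim a different row instead of
# blocking on one another -- a standard PostgreSQL queue pattern.
CLAIM_NEXT_JOB = """
UPDATE jobs
SET status = 'running'
WHERE id = (
    SELECT id FROM jobs
    WHERE status = 'queued'
    ORDER BY id
    LIMIT 1
    FOR UPDATE SKIP LOCKED
)
RETURNING id, payload;
"""

def claim_sql() -> str:
    """Return the claim query; a driver would run it in a transaction."""
    return CLAIM_NEXT_JOB
```

For modest workloads, this keeps the queue in the same durable, well-understood database as everything else, with transactional semantics for free.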

I prefer boring where it counts

One pattern in my stack is that I do not chase novelty in foundational layers.

I want boring in the places that matter:

  • request handling
  • validation
  • storage
  • deployment
  • job orchestration
  • content rendering

That does not mean I avoid new tools. It means I want new tools to earn their way in.

I do not mind experimenting around the edges. I do mind introducing unnecessary complexity into the center of the system.

The stack matters less than the shape of the system

This is probably the bigger point.

The stack matters, but architecture matters more.

A good stack with weak boundaries still becomes messy. A modern toolchain without conventions still drifts. A beautiful framework still creates pain if ownership is unclear and the system model is fuzzy.

What I care about most is building systems where:

  • responsibilities are clear
  • data has a path
  • state is explicit
  • deployment is repeatable
  • the next engineer can actually understand what is going on

That is the standard I try to hold.

Final thought

I still like modern tools. I just want them to help me build more clearly, not more cleverly.

That is why I keep coming back to Python, FastAPI, React, Databricks, dbt, Prefect, and PostgreSQL. It is not the flashiest stack in the world. It is just a solid one.

And honestly, that is usually what wins.