Most automation projects don't fail because the code was bad. They fail because someone decided to automate the wrong thing, and nobody caught it before three months of engineering went into making the wrong thing run faster.

I've seen this pattern across ten years and a lot of organizations. The technical work is rarely the problem. The framing is.

The process you're given isn't the process that exists

When someone hands you a workflow to automate, the document or the diagram you get is the official version. The actual workflow, the one that happens every day, almost always includes steps that aren't on the diagram.

The shadow steps. A spreadsheet someone updates manually because the system doesn't show what they need to see. A Slack message they send before approving the form because they don't trust the form. A weekly meeting where two people reconcile the data because the data is wrong.

If you automate the official version, you get a system that runs alongside the real workflow, doesn't replace it, and adds a maintenance burden on top.

The first job is to find the shadow version. Watch the work happen. Ask "what did you have to do that wasn't in the documentation?" Until you've seen the steps the team takes that no one wrote down, you don't know what you're automating.

Automating a broken process makes it broken faster

If the process is misshapen, automation amplifies the flaws. A weekly approval that should be a daily approval gets a beautiful workflow engine and runs weekly forever. Duplicate data entry that exists because two systems can't talk gets an automated bridge that calcifies the duplication permanently.

Always ask: should this even exist? Could half of these steps be deleted? Could the form be shorter? Could the approval be implicit? Often the right move is to simplify the process before automating, and sometimes the simplification removes the need to automate at all.

People resist automation that takes away judgment

If a step in the process exists because someone on the team is exercising judgment, and you replace that step with a rule, you will lose the judgment and you will not regain it through better rules.

The person doing the work was probably catching edge cases the rule won't catch. The thing they were doing was load-bearing in ways that aren't visible until you stop doing it.

Two ways to handle this. Either keep the human in the loop for the judgment call and automate everything around it. Or talk to the person, find out what they're actually checking, and decide together whether the rule is good enough. Don't decide on their behalf.
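The first option, keeping the human in the loop, can be sketched in code. This is an illustrative example, not from the article: the names (`route_invoice`, `review_queue`, the vendor list, the amount threshold) are invented, and the point is only the shape, where the rule handles the cases it clearly covers and everything else goes to the person who was exercising judgment.

```python
# Hedged sketch of human-in-the-loop routing. All names and thresholds
# here are invented for illustration.

def route_invoice(invoice, review_queue, auto_approve):
    """Auto-handle only the cases the rule clearly covers;
    queue everything else for the human who was doing the check."""
    known_vendor = invoice["vendor"] in {"acme", "globex"}
    small_amount = invoice["amount"] < 500
    if known_vendor and small_amount:
        auto_approve(invoice)      # the rule is good enough here
        return "auto"
    review_queue.append(invoice)   # preserve the human judgment
    return "human"
```

The design choice worth noticing: the default path is the human, not the rule. The automation has to earn each case it takes over, which is the opposite of deciding on the person's behalf.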

Maintenance is the hidden cost

Every automation has an ongoing cost. Things change upstream. APIs get deprecated. Schemas drift. The team that uses the automation grows and the workflow evolves. Six months after you ship, someone has to keep this running.

If the automation is owned by no one, it will rot. If the automation is owned by an engineering team that doesn't see the workflow daily, it will rot subtly. The best-cared-for automations are the ones where the people who use them daily can also reach inside them and adjust without asking permission.

Build with that in mind. Use a stack the team can maintain. Document the assumptions. Accept that an automation is a living thing, not a delivered artifact.
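One way to make "document the assumptions" concrete, as a sketch rather than a prescription: turn the assumptions into a preflight check that runs before the automation does, so upstream drift fails loudly instead of rotting silently. The field names below are invented for illustration.

```python
# Hedged sketch: encode the automation's assumptions as an executable
# check. The expected fields are hypothetical, not from any real system.

EXPECTED_FIELDS = {"order_id", "status", "updated_at"}

def check_assumptions(sample_record):
    """Return a list of broken assumptions; empty means safe to run."""
    problems = []
    missing = EXPECTED_FIELDS - sample_record.keys()
    if missing:
        problems.append(f"upstream schema drifted: missing {sorted(missing)}")
    if not isinstance(sample_record.get("order_id"), str):
        problems.append("order_id is no longer a string")
    return problems
```

A check like this doubles as documentation the maintaining team can read and adjust without asking permission, which is the ownership property the paragraph above is arguing for.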

Key takeaway

The first three steps of an automation project (shadow the workflow, simplify the workflow, decide what stays human-in-the-loop) get you 80 percent of the value. The code is the last 20.

A workable order of operations

  1. Shadow the actual workflow, including the unofficial steps.
  2. Simplify the workflow on paper. Cut everything you can cut.
  3. Decide what stays human-in-the-loop and what can be a rule.
  4. Build the simplest possible version of the automation that handles the common case.
  5. Plan for who owns it and how they'll change it.

The teams I've seen ship automation that lasts are the ones that took those first three steps seriously. The teams I've seen abandon their own automations a year later are the ones that started by writing code.