• Korhaka@sopuli.xyz
    · 14 hours ago

    Dumping databases is such a small-brain move. That costs what, half a department’s overtime for a few days? Nothing.

    Deploying consistently poor-quality code costs far more. That can double the size of the support department and keep them busy for years, costing the organisation far more.

    • notabot@piefed.social
      · 1 hour ago

      Yup, don’t dump the database, just shuffle the primary keys on important tables. The systems keep running, so it takes longer to work out what’s wrong, and the data gets even more screwed up with every passing transaction. If you’re also talking to external systems, you very rapidly end up with a basically unrecoverable mess, as none of their data will match yours anymore, even if you do try to recover from an older backup.