I Forge Iron

Automating disasters in seven easy steps


Recommended Posts

I must be getting older because I'm starting to notice second and third generations of the same bad business ideas that failed before.  Although the marketing is different, the central line of thinking is the same.  Hopefully I'll touch on something that helps others to identify and avoid these mistakes.

Step one: We're not successful enough right now so we have to radically change our processes.  OK, new approach time. Let's start by throwing out all of our "best practices".  Let's huddle up and talk about innovation, synergy, silos, LEAN, Eatwhatchukill, etc.

Step two: Build a model.  Now a well-behaved model is going to give you the answer you wanted from the beginning.  So naturally, you wanna build the model so that it begins with the conclusion, then generates semi-plausible data points to justify it.  Bonus points if the data points are based on extrapolations from consensus-driven experts.  In my industry, this would be reference materials like the RS Means series.

Step three: Focus on adjustments to your model.  If your model really did generate the correct answer every time, well, there'd be no point in paying you to stick around right?  Naturally you're going to recommend that the model requires someone with a deep understanding of what's going on to finesse the best possible answers.

Step four: This is going to take a lot of time away from pesky stuff like best practices.  Plus, the results of the time-honored approach might contradict our model.  This is the way of the future, we can't march in two directions.  You can always spot a rookie because they can't say that last part with a straight face.

Step five: Now that we're in the future and nobody's doing things the old way, we have to adjust our thinking.  This means that we're free to conjure up whatever mechanisms we might need to explain why reality refuses to admit that our model was right.  If you stumble onto a particularly handy mythical mechanism, you might even be asked to generate a model to track that!

Step six: Stockpile information.  Any truly efficient model will convert the input into mountains of information.  Show 'em the math and they'll surely recognize what a scientific venture this really is.

Step seven: This is really the best part.  See the model isn't human so it doesn't make mistakes.  The people inputting the information didn't make mistakes either.  They were just working with what they were given.  Now obviously everyone is doing their very best so it's just an unfortunate consequence that the new normal will be slightly delayed.  Of course with every iteration of this cycle, we get more accustomed to the notion that success is relative.  Feel free to tell your boss that every step takes you halfway to your destination, which is a really clever way of saying that you'll never get there.
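That halfway quip is just a geometric series in disguise: if every step covers half the remaining distance, after n steps you've covered 1 - (1/2)^n of the journey, which gets arbitrarily close to done without ever arriving. A toy sketch in Python (purely illustrative, no industry numbers implied):

```python
def distance_covered(steps):
    """Fraction of the journey completed when each step
    covers half of whatever distance remains."""
    remaining = 1.0
    for _ in range(steps):
        remaining /= 2  # each "successful" step halves what's left
    return 1.0 - remaining

# Ten steps gets you about 99.9% of the way there, but no finite
# number of steps ever gets you to 100%.
```

Great-looking progress numbers every quarter, and the destination never shows up.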

So that's it, seven easy steps to automate disaster!  I've seen multinational firms that did this kind of thing as often as I've seen it with sole proprietorships.  I'm guessing that this secret seven step plan has laid down roots beyond the construction industry, but I don't know that for certain.

 


Where is the step where the model is adjusted to ignore its own results proving it can't work? The particular model I'm thinking of proved it's not possible to model the problem in any predictable way, but spawned a whole new science. Chaos theory by name.

When I was drilling test holes, the guys on the crew and I learned to hide our expressions when some bridge "designer" was just having us confirm what he already "knew" from flying over and getting a look from the air. One career-challenged individual thought he could threaten the ground into admitting he was right, and was aghast when the Materials engineer and road commissioner asked for his resignation "on grounds." I loved that they used the alternate term "on grounds" instead of "for cause." I'm sure it was NOT to twist the knife a little.

Frosty The Lucky.

 


I have shown stress analysts SEM images, or metallographic micrographs, or other data with clear evidence of a root cause of failure.  I would explain what is proven, what is ruled out, and what may still be suspected but not known, along with the implications.  Sometimes that drove positive solutions to problems. Sometimes they would argue that we are not looking at what we are clearly looking at, "because the model says..." 

:D


I think this may also explain how 3rd party logistics companies have gained a foothold.  They decrease efficiency, they create extra work and expense for everyone involved, and pretty much no one who deals with them directly has a positive opinion of them (that I've talked to anyway), but in *theory* they decrease costs by increasing competition.  Of course when you ignore the importance of longstanding relationships and overall service, frequently the "cheap" option ends up being the most expensive option.  If you can pull up a nifty screen on your computer that shows a bunch of information in real time it must be better though, right?


Although the quote itself is not Forum Friendly, Samuel Clemens once stated something about lies and statistics.

Just what DO you want the numbers to say? 

Of course, the problem isn't using data to make decisions. The problem is not understanding what the data can reasonably tell you, and with what confidence, and then using the available data to hide that ignorance and justify -- or excuse -- poor decisions based on useless or even intentionally misleading data.


I think that a major part of the problem is the result of an emphasis on innovation.  Everyone wants to do something new, whether it is better than the old or not.  No one gets recognized or rewarded for doing things the same old way.  Innovation is rewarded; maintaining the status quo is not.  Unfortunately, improved results are not the measure of success; new ideas, models, and "fresh insight" are the goal.  Sometimes new ideas and approaches are good and result in a better way of doing things, but they are usually incremental, not broad and sweeping.

Also, with new innovations you get to use new buzzwords and phrases. Rockstar, you forgot forming a tiger team to scope the paradigm.  And no one wants to admit that they don't know or understand what the new buzzword or phrase means.

Sometimes it feels like you are living in a Dilbert cartoon.

"By hammer and hand all arts do stand."


Yep ... and when everything else fails ... trade in bitcoins! Now there is an innovative initiative. 

Actually I liked the chain letters and pyramid schemes... aah the memories :)

Quick or you miss out !!!


One of Lisa's friends approached her about some vitamin supplement he was selling that was supposed to work wonders for autistic kids like our son. He sent us a whole page of links to peer-reviewed scientific studies that supposedly proved the purported benefits, little imagining that I would actually read them all. Long story short: many studies indicating a possible effect in a specific part of the brain, BUT (1) almost all were in rats, (2) the only one in humans was inconclusive, and (3) there's no good science demonstrating that that part of the brain controls autism anyway. So, thanks, but no thanks.

The moral of the story is that data in itself isn't good or bad, but that we have to be very careful about understanding how well it does or doesn't answer the questions we're asking* and also about whether data is being presented to us to clarify or to confuse. Pace Mr. Clemens, statistics themselves are not intrinsically false, but they can be used very effectively to hide the truth and present falsehoods if we don't look at them closely.

 

*We also need to be careful about understanding whether or not the question we're asking is the one we should be asking.


On 7/21/2020 at 5:23 PM, Frosty said:

Where is the step where the model is adjusted to ignore its own results proving it can't work?

That would be step five!

 

On 7/21/2020 at 5:25 PM, ThomasPowers said:

I've always found it interesting when the "model"  is considered to be more accurate than actual data.

In many cases, the people backing the model see this as a feature, not a bug.

 

On 7/22/2020 at 12:26 AM, George N. M. said:

Rockstar, you forgot forming a tiger team to scope the paradigm. 

George, that's comedy gold and I will put it to use at my first opportunity!  Also, I think innovation is less the attractant than the opportunity models create to evade culpability.  Everyone can pinky swear to their wonderful intentions as the project unravels.  I almost forgot the other aspect of these models that's like catnip to modern business culture.  Collaboration!  Now everybody gets to be seen participating in a consensus-driven thing without all the pesky work, accountability, and "old fashioned" standards like deadlines or budgets.

 

JHCC,

Good points as always.  Your comments remind me of an article I read a while back about what goes on in academic publishing.  First off, it's incredibly self-referential.  Citing other published works is done so frequently that it's almost impossible to verify that the cited works (and the works they cite) actually stand up to scrutiny.  Second, virtually nobody actually reads the scientific papers that get published.  This was proven a few times by folks who filled out the center of their work with stuff like the script from Star Wars.  More alarming than all of that is a fundamental shift in the scientific method towards grant-funded publications.  Rather than testing their hypothesis with an experiment, researchers are intentionally creating experiments that generate a huge volume of data points.  Most of it is noise, but that's ignored because the next step is to cull the database for information that supports their hypothesis.  Technically, they do have supporting data; however, it may bear little resemblance to the bigger picture.  This is where things really get dicey.  Peer review is supposed to include tests for repeatability.  Some hot new publication comes out and everyone jumps on board, citing that work in their research, but nobody can actually repeat the results of the original experiment.  Incredible amounts of time, money, and human capital are getting wasted on dead ends because the scientific method has been abandoned.
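That "cull the database for supporting data" step is the multiple-comparisons problem in a nutshell: test enough hypotheses against pure noise and some of them will look significant by chance alone. A minimal simulation (plain Python; the 5% threshold and sample size are just illustrative):

```python
import random

def dredge(n_hypotheses, n_samples=30, seed=1):
    """Test many null hypotheses on pure noise and count how many
    clear a p < 0.05 significance bar purely by chance."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_hypotheses):
        # every "metric" here is noise: no real effect exists
        sample = [rng.gauss(0.0, 1.0) for _ in range(n_samples)]
        mean = sum(sample) / n_samples
        # z-test on the sample mean: |mean| > 1.96 / sqrt(n) counts as
        # "significant" at the 5% level even though nothing is there
        if abs(mean) > 1.96 / n_samples ** 0.5:
            hits += 1
    return hits
```

Run a few thousand hypotheses and roughly one in twenty "discoveries" pops out of random noise, which is exactly the kind of result that gets published and then fails to replicate.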


What George N. M. said!  Nobody gets their doctorate proving that the way people have been doing things works best!  (I see this a lot in education, where the "new" may not work as well as the "tried and true", and one size fits all NEVER does!)

Back in the '60s I participated in an experimental school set up in 5th & 6th grade---learn at your own pace.  Well, we did a great job proving that where a student was interested, they were often several grade levels ahead, and where they were not, they lagged a grade level.  Funny thing: we moved twice after that, and at my new high school they were "recovering" from the exact same experiment.  Most math classes had to teach the previous year's math as well as the math expected for their year.  Guess nobody examined the data from before.

Now, noise in data does not mean there is a problem. There are ways of filtering.  A lot of the seminal work in things like physics was built on really lousy data that they were able to use to create the "modern" world anyway. Of all the experiments I had to do in college physics, only one, the photoelectric effect, had theory and experimental data agree out to 4 decimal places. (Of course, data filtering can discard the good stuff just as easily, and then there are artifacts... I knew an astrophysicist once who was complaining about a paper submitted for publication.  The author had *found* groundbreaking stuff right where the instrument used had a sensor crossover and data was supposed to be excluded... My friend knew that instrument and knew that the data was bogus in that range, and wondered why the current user didn't!)
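For instance, even something as crude as a moving average can pull a usable signal out of lousy measurements. A bare-bones sketch (plain Python, purely illustrative):

```python
def moving_average(data, window=5):
    """Smooth a series by averaging each point with its neighbors;
    the window shrinks near the edges so no points are dropped."""
    half = window // 2
    smoothed = []
    for i in range(len(data)):
        lo = max(0, i - half)
        hi = min(len(data), i + half + 1)
        chunk = data[lo:hi]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed
```

The flip side, as above, is that the same filtering can just as easily discard the good stuff; a sensor-crossover artifact and a real discovery look identical until someone who knows the instrument checks.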


All:

This has appeared any number of places previously, but in case anyone has not seen the strategies for what to do when you find yourself beating a dead horse, here they are:

  • Buying a stronger whip.
  • Changing riders.
  • Threatening the horse with termination.
  • Appointing a committee to study the horse.
  • Arranging to visit other sites/countries to see how they ride dead horses.
  • Lowering the standards so that dead horses can be included.
  • Appointing an intervention team to re-animate the dead horse.
  • Creating a training session to increase the rider's load share.
  • Re-classifying the dead horse as living-impaired.
  • Changing the form so that it reads: "This horse is not dead."
  • Hiring outside contractors to ride the dead horse.
  • Harnessing several dead horses together for increased speed.
  • Donating the dead horse to a recognized charity, thereby deducting its full original cost.
  • Providing additional funding to increase the horse's performance.
  • Doing a time management study to see if the lighter riders would improve productivity.
  • Declaring that as the dead horse does not have to be fed, it is less costly and therefore contributes substantially more to the bottom line of the economy than do some other live horses.
  • Purchasing an after-market product to make dead horses run faster.
  • Declaring that a dead horse has lower overhead and therefore has a better price/performance factor.
  • Forming a quality focus group to find profitable uses for dead horses.
  • Rewriting the expected performance requirements for horses.
  • ** Promoting the dead horse to a supervisory position. **
  • Applying for a government subsidy to retrain dead horses.
  • Starting a campaign to deregulate the use of dead horses because regulation interferes with innovation and efficiency.
  • Demanding a tax abatement or the company will move its dead horse operations to a more business friendly location.
  • Declaring the work of dead horses as essential for national security thus making it illegal for dead horses to go on strike.
