Deloitte’s Cameron Pitt recently (November 2022) gave a talk to an IPAA ACT forum about the “Future of Work”.

He made several important points (taken from the Mandarin report on the forum):

  • Government leaders needed to support more interdependent ecosystems that include how departments worked together, shared data, moved resources and delivered services to the community.
  • Pitt said there was also a concerted push to build a so-called “collective intelligence” capability.
  • “Ours is all about making our consultants smarter with the use of AI,” he added.
  • He said project-focused work that saw consultants come and go after a few months, with no cultural investment in outcomes and delivery, would need to change into a more meaningful interaction.
  • “Our clients […] have said to us: ‘It’s an ecosystem, it’s an orchestration, we actually don’t want you to deliver services. We want you to own outcomes for the organisation, and work deeply with us to deliver outcomes not deliver a document or an engagement’.”

He didn’t mention the Four Pieces Limit, but it is an important effect underlying most of what he said.

Departments can’t work well together when their documents use different meanings for the same words. A “collective intelligence” will require either considerable dumbing down, or something that can translate the meaning a word carries in one department into the meaning it carries in another.
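One way to picture that translation problem is as per-department glossaries that map local terms onto shared concepts. This is a minimal sketch only: the department names, terms, and glossary structure below are invented for illustration, and a real system would need to handle meanings far richer than one-line definitions.

```python
# Illustrative only: departments, terms, and concepts are invented examples.
# Each department maps its own preferred term to a shared, neutral concept.
GLOSSARIES = {
    "Health":   {"client": "person receiving a service",
                 "episode": "single period of care"},
    "Services": {"customer": "person receiving a service",
                 "claim": "request for payment"},
}

def translate(term, from_dept, to_dept):
    """Find the term the target department uses for the same concept."""
    concept = GLOSSARIES[from_dept].get(term)
    if concept is None:
        return None                      # term unknown in the source department
    for their_term, their_concept in GLOSSARIES[to_dept].items():
        if their_concept == concept:
            return their_term
    return None                          # no equivalent term on the other side

print(translate("client", "Health", "Services"))   # -> customer
print(translate("episode", "Health", "Services"))  # -> None (no shared concept)
```

The point of the sketch is the failure case: where no shared concept exists, the translation simply fails, which is exactly where a “collective intelligence” would otherwise dumb the meaning down.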

“… smarter with the use of AI”
Nowadays, the original intention of AI, that of embedding intelligence in a machine, has been largely abandoned. Machine Learning (ML, and its descendants DL and LLMs) uses a pre-programmed device, an Artificial Neural Network (ANN), to connect inputs to outputs, in the hope that the learnt weightings will produce the correct response. This approach has prevailed because it is much easier than trying to build intelligence into a machine: you just use a big block of text as the data. It effectively lives in the past, and cannot respond to new events (or even a newly defined term in a piece of new legislation or a specification).

An alternative approach, under a new name, is Artificial General Intelligence (AGI), where the structure much more closely emulates the activity in a real neural network; by doing so, the path from words to outputs becomes much easier to understand and believe. Needless to say, if an AI application has no understanding of what the words mean, the Four Pieces Limit is irrelevant to it.
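The “weightings connect inputs to outputs” idea can be made concrete with a toy single-neuron network. This is a sketch, not a trained model: the inputs, weights, and bias are invented numbers, and the point is only that the response is a fixed arithmetic consequence of the weights, with no representation of what the inputs mean.

```python
import math

def forward(inputs, weights, bias):
    """Weighted sum of inputs, squashed to a 0..1 'response' by a sigmoid."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# The network can only reproduce patterns baked into its weights during
# training; an input it has never seen gets a guess, not understanding.
response = forward([1.0, 0.0, 1.0], [0.8, -0.4, 0.6], bias=-0.5)
print(round(response, 3))   # -> 0.711
```

Whatever the inputs stand for, the machinery is the same weighted sum, which is why a newly defined term in legislation changes nothing until the weights are retrained.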

“He said project-focused work that saw consultants come and go after a few months …”
We see value in a project that introduces a means of handling the Four Pieces Limit into a department. It requires someone familiar with the complexities of language, who can embed a culture of having everyone on the same page: ensuring that people who object because they don’t understand are respected and coached, and that people who are diffident about raising valid objections are coaxed into contributing. After one or two such projects, the organisation should be able to carry on unassisted.

‘… work deeply with us to deliver outcomes, not deliver a document or an engagement’
Getting down into the details of how words work really is “working deeply”. It won’t be everyone’s cup of tea, but if it can fend off the billions wasted when a specification is misunderstood, or when legislation doesn’t do what was intended, it will be worth it. Even if all it does is make for an easier working environment, with a machine “having your back” by keeping the document “alive” so confusion is minimised, it will be worth it.