Greetings from Roll the Bones! Here we muse about complexity, learning, epistemology, accomplishing goals in complex environments, and whatever else I might be in the mood to discuss.
One thing I want to drive home is that life is not a game. The only simple, widely-understandable rules it follows are given by God Himself. In all other aspects, things are often more complex than they appear.
This Week’s Links
There are no affiliate links here, just things I’ve been reading. None of the authors have any idea their work is about to be featured.
The Social Limits of Scientific Certainty
James Evans talks about how our teams, social groups, etc. affect scientific and scholarly processes. In other words, how does science as a system work? Emergence, complexity, etc. come into play, as well as some worrisome results about the effects of greater connectivity. The video is an easy watch, and James is a decent presenter.
My thoughts:
Clustered tastes often mean clustered epistemologies. This means there is a lot of “inertia” for prevailing theories and lines of thought. There is peer pressure to find what agrees with what your social group already believes to be true. James calls this “canonical crystallization”.
He gives an example: 10 supporting studies, but only 2 independent approaches, or only 4 independent groups. There are many confounding variables in what we might call “scientific consensus”.
An analogy is made that technology is “fragile” - it works in specific contexts - while science should be “robust”, or more generalizable across contexts. I would suggest that a lot of the concern about replication is due to “fragile science”.
“Findings from decentralized communities are much more likely to replicate”. If true, then centralization of science efforts leads to fragile results. That said, if we are to decentralize then we must become more comfortable with failed experiments.
This is another instance of the explore-vs-exploit tradeoff: centralized consensus exploits established lines of research, while decentralized communities explore more of the space.
Emergent Interaction in HCI
This paper is a proposal to start a workshop focusing on a complexity science approach to Human-Computer Interaction (HCI). The workshop website can be found here. The intent is to study HCI in the context of interactions between components and the phenomena that result, rather than the more common method of looking at things individually. This continues a trend I’ve observed over the past several years of complexity getting into all the other sciences “the way cranberry juice gets in all the other juices at the grocery store”.
My thoughts:
As discussed in the Metascience article, additional workshops, collaborations, and connections can lead to more predictability and fragile outcomes. However, strong improvements and novel results were also found to come more frequently when “aliens” from one field invade another. Complexity + HCI is an opportunity to mix disciplines and see if something new emerges from the clash of cultures, traditions, and priors. A new methodology is probably worth the risk.
HCI is related to ergonomics and to cybernetics. Ultimately, this is about humans interacting with tools that are highly powerful and dynamic. This is also seen in related fields such as Cognitive Systems Engineering and Resilience Engineering. All signs point to considering humans as part of larger systems and trying to understand how to accomplish our goals within them. This mix seems a natural fit.
If I were to throw something into the mix, it would be probability theory, information theory, options theory, or extreme value theory. Uncertainty is a powerful force, perhaps especially in contexts where we are overflowing with data and need to evaluate/filter it and, more importantly, act.
Yes, this is another article with ties to previous works/labs I have shared in this newsletter. I continue to search for diverse perspectives and work, but I am a living example of that crystallization effect mentioned previously.
Complexity: Learning to Muddle Through
John Flach takes a stab at defining complexity: interdependence and dimensionality. He breaks the reasoning down into digestible chunks, acknowledging that there are many definitions of complexity and that the most common one is really more like “difficulty”. The implications are interesting.
My thoughts:
Control is described as “destroying variety” in the sense of Ashby’s Law of Requisite Variety. You are reducing the possible states of the system.
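A minimal sketch of that idea (a toy example I made up, not from Flach’s article): a controller “destroys variety” by mapping many possible disturbance states onto a few acceptable outcome states. Here variety is simply the count of distinct states, and the hypothetical `thermostat` stands in for any regulator.

```python
def variety(states):
    """Ashby's 'variety': the number of distinct states a variable can take."""
    return len(set(states))

# Disturbances can push room temperature anywhere in a wide range...
disturbances = [14, 16, 18, 20, 22, 24, 26]

# ...an uncontrolled room simply mirrors them (variety passes through),
uncontrolled = disturbances

# ...while a thermostat (the controller) absorbs that variety,
# collapsing many input states onto a narrow band of outcomes.
def thermostat(temp, setpoint=20, band=1):
    return min(max(temp, setpoint - band), setpoint + band)

controlled = [thermostat(t) for t in disturbances]

print(variety(uncontrolled))  # 7 distinct states
print(variety(controlled))    # 3 distinct states: 19, 20, 21
```

The controlled system has fewer reachable states than the disturbances acting on it - which is exactly what Ashby means by the regulator needing variety at least equal to the disturbances it must absorb.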
As the pace of change in a system increases, it becomes harder to centralize the digestion of data and the making of decisions. Because of this bottleneck, more decision making must be left to those closest to where things are happening.
Most modern work environments and systems in nature are intractable: changing, interdependent, highly dimensional, and rich in detail.
A lot of research is done on systems with low interdependence or dimensionality, and then extrapolated/generalized broadly. We should sample the “complexity” space more broadly to understand how the patterns observed change as the system size/complexity shifts.
Balancing control and observation is the same tradeoff as explore vs. exploit in Reinforcement Learning and other disciplines. Interestingly, in many environments the observation and/or control must be distributed among many agents.
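The explore-vs-exploit tradeoff mentioned above is often introduced via the multi-armed bandit. A minimal epsilon-greedy sketch (my own toy illustration - the arm means, epsilon, and step count are all arbitrary assumptions):

```python
import random

def epsilon_greedy_bandit(arm_means, epsilon=0.1, steps=2000, seed=1):
    """Toy epsilon-greedy bandit: with probability epsilon pull a random
    arm (explore); otherwise pull the arm with the best empirical mean
    (exploit). Returns the average reward per step."""
    rng = random.Random(seed)
    counts = [0] * len(arm_means)
    values = [0.0] * len(arm_means)   # running empirical mean per arm
    total = 0.0
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(len(arm_means))                       # explore
        else:
            arm = max(range(len(arm_means)), key=values.__getitem__)  # exploit
        reward = rng.gauss(arm_means[arm], 1.0)                       # noisy payoff
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]           # update mean
        total += reward
    return total / steps

avg = epsilon_greedy_bandit([0.1, 0.9])
```

Set epsilon too high and you waste pulls on bad arms (all observation, no control); set it to zero and you risk locking onto a bad early estimate - the same tension Flach describes between observing a system and acting on it.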
Thank you for reading and engaging.
I appreciate you taking the time to read this newsletter. It’s free, but if you want to support me, you can always make a one-time donation.
Engage with me on Twitter at @10101Lund about these or any other topics. If you find an error in this newsletter, please let me know and I’ll correct it. Archives of this newsletter are here.
Tell a friend. See you next issue!