Review: Invisibility, MOD museum, Adelaide
Disinformation, algorithms, big data, care work, climate change, cultural knowledge: they can all be invisible.
In her New York Times bestseller, Weapons of Math Destruction (2016), subtitled “how big data increases inequality and threatens democracy”, mathematician and data scientist Cathy O’Neil unpacks the elusive algorithms of our everyday lives and how accurate, fair or biased they might be.
Algorithms hide behind the assumed objectivity of maths, but they very much contain the biases, subjective decisions and cultural frameworks of those who design them. Because so little detail is released about how these algorithms are built, O’Neil describes them as “inscrutable black boxes”.
Opaqueness is intentional.
In one of the upstairs galleries at the spacious MOD, we are greeted in large text as we enter: “what do algorithms think about you?”
Can an algorithm think? we ask. And if so, what informs the decisions it makes about us?
Biometric Mirror is the work of science fiction artist Lucy McRae and computer scientists Niels Wouters and Frank Vetere, who built an algorithm to judge our personalities by asking 33,000 people to look at photographs of faces and suggest possible personality traits.
Can an algorithm tell you who you really are? Topbunk
We don’t see who the photos are of or who is doing the evaluating – and therefore we don’t know what biases might be reproduced.
You are invited to gaze into a mirror which scans your face. From this scan, the algorithm creates a profile of your age, gender and personality, which appears as statistical data overlaid on your reflection.
When I look into the mirror, I am told I am neither trustworthy nor emotionally stable. The algorithm misses my age by a few years, and I score highly for intelligence and uncertainty – an unhelpful combination.
Despite my doubts about the algorithm, I notice myself focusing on the more favourable data.
In this context, the data is benign. But facial recognition technology has been used to surveil and monitor activists, and has been responsible for thousands of inaccurate identifications by police in the UK.
Using data to illuminate cultural knowledge
In one of the more impressive works in the exhibition, contemporary data visualisation is used to illustrate Aboriginal forms of knowing and the intrinsic relationship between spatial awareness, Country and kinship.
Ngapulara Ngarngarnyi Wirra (Our Family Tree) is a collaboration between Angie Abdilla from design agency Old Ways, New, Adnyamathanha and Narungga footballer Adam Goodes and contemporary artist Baden Pailthorpe.
In every AFL game Goodes played, his on-field movements were recorded by satellite via a tracking device in the back of his jumper. Twenty million data points were then fused with data scans of a River Red Gum, or Wirra, to form an impressive data visualisation projected onto two large screens in a darkened gallery.
In Ngapulara Ngarngarnyi Wirra (Our Family Tree), data from Adam Goodes’ football games is returned to Country. Topbunk
Here, Goodes’ data is returned to Country to form part of the roots of the tree as well as the swirling North and South Winds of his ancestors. The data is also translated into sound and amplified, inviting us to listen to what would otherwise be inaudible.
In a small room between the screens – or within the tree – drone footage of Adnyamathanha Country (the Flinders Ranges) plays against a retelling of the creation story in the Adnyamathanha language.
What results is a synthesis of traditional Aboriginal knowledge and cutting-edge technology, revealing different ways of sensing space and time.
The power of the invisible
While it’s easy to focus on how technology is used and exposed in the works in Invisibility, down the corridors and hanging from the ceiling in MOD are a few other exhibits that flesh out the concept of invisibility.
Women’s Work recognises the leadership of Indigenous women. Topbunk
Women’s Work celebrates the leadership of South Australian Aboriginal women in striking black-and-white photographs. Tucked away down the hall on the second level is Fostering Ties, a series of images drawing attention to children in foster care.
This exhibition foregrounds invisibility as a way to contend with our own blind spots, knowledge systems, biases and cultural frameworks.
What is invisible to us may not be invisible to those from demographic, cultural or language groups that differ from ours.
Drawing attention to the invisible encourages us to shift our perspective. If we don’t have the answer to a problem, maybe another cultural perspective – or life form – does.

