Blog Week 19
Augmentations and plasticity

How does cognition work? Why are we drawn to do things that serve no inherent evolutionary goal? And why does it seem that technology operates in the same way?

Looking at Ian Cheng's body of work, the progression from B.O.B. to Life after B.O.B. mirrors the shift from comparing the mind to a switchboard to current views of the mind as having plasticity.


I guess as we push forward the randomness of development just shows that wherever we are along our multi-dimensional emergent paths we don’t know what we don’t know.

Trauma and plasticity. Breaking things until they work. Fishing around for knowledge in the murky depths of the unknown.

We’ll always be poorly equipped for the future, but the future also holds unknown tools. And we’ll just have to wait to bash together whatever new tools we craft and see what vague new shape they illuminate.

Endless cave system of stuff to wander through. Fire. candle, lamp, torch, floodlight. Still in the dark. 3D scan. Infrared. Sensor. Component, component, component. Radio beams through the rocks. Atomic microscope. Huh the rocks still feel pretty solid, and all this stuff’s still here. Simulation. Exploration. Unmanned robot army mapping unexplored endless paths. Quantum computing an answer to a limited question. GRRRR our brains still don’t have THE answer. Stand on a soapbox and give your opinion on the cave and its mythologies. Paint the image of the cave. Paint the feeling of cold rock. Poem about the sublime cave plants. Map the system. Record the history. Documentary on a cave explorer. Still not there.

My brain’s aching from straining to imagine new dimensions and knowledge. Think there’s only so much you can do in a day.

I think I have more questions than I began with. I guess that’s kinda promising.



Blog Week 18: Final Project update

Since everything's still a bit wonky with the release of class material and the blog dates, rather than a third entry on OOO I'll write a brief update on my final project progress instead.

As the preliminary stage of creating my game/experience on situated knowledge, I need to create a dataset of abstracted images. To make this I am using a Kinect with openFrameworks to take depth images of various objects (or units of data) and then binarize them into coded images. Though the process encodes the objects in a way that could not be reliably decoded, it serves to abstract the objects out of a user's sphere of comprehension. This gives the player the opportunity to decode/interpret the images as they begin to assemble the situated knowledge of their new location.
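Since I haven't posted the capture code itself, here is a minimal sketch of the kind of depth-thresholding pass described above, assuming the ofxKinect addon (method names vary between openFrameworks versions); the 'kinect', 'coded' and 'threshold' members and the cut-off value are placeholders for illustration rather than the project's actual code.

// ofApp.cpp: minimal sketch, assuming ofxKinect; kinect (ofxKinect),
// coded (ofImage) and threshold (int) are declared in ofApp.h.
#include "ofApp.h"

void ofApp::setup(){
    kinect.init();      // connect to the first Kinect found
    kinect.open();
    threshold = 128;    // hypothetical cut-off in the 0-255 depth range
}

void ofApp::update(){
    kinect.update();
    if(kinect.isFrameNew()){
        // copy the single-channel depth frame and binarize it in place:
        // pixels brighter than the cut-off become white, the rest black
        ofPixels depth = kinect.getDepthPixels();
        for(size_t i = 0; i < depth.size(); i++){
            depth[i] = (depth[i] > threshold) ? 255 : 0;
        }
        coded.setFromPixels(depth);   // the abstracted, 'coded' image
    }
}

void ofApp::draw(){
    coded.draw(0, 0);   // or write to disk, e.g. coded.save("coded_frame.png")
}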

These images contain colour data as well as depth data. Here is an example of an initial version:





Blog Week 17: Everything's a bit wonky, we're back to object-oriented ontology.

‘the human body was disciplined, normalized, sped up and slowed down, gendered, sexed, nationalized, globalized, rendered disposable, or otherwise composed.’ – The Force of Things, Bennett

Carrying on from my reflections in week 15…

I’m still not sure I know what it’s like to be a thing. I guess I am one though. At least through some lenses. I’ve been child, girl, woman, something.

Autonomous thing? I guess that’s not so bad. If I am an object, I’m privileged enough to not have been reminded of it too often.

I'm also a creator of objects. Is a painting having an experience? Or is it a document of mine?

Does my PC miss me when I'm gone? When I replace a component, does it lie in some electrical recycling bin longing for the good old days? My sad, sentimental, human lens makes me wonder if they're thinking like this:

They had liveliness at some point. Did I take that from them? Was I good to them?

Am I just coming back to being human oriented? OOOps



Blog Week 16 Computational Art Data & Witnessing

“the fraught issue of how we might have an ethical engagement with places that are at a distance from us.” – Digital Narratives and Witnessing: The Ethics of Engaging with Places at a Distance by Nishat Awan

It is almost impossible not to relate this week's subject to the ongoing Russian invasion of Ukraine.

My grandma told my mother over the phone that she was impressed by the reactions and sanctions being imposed on Russia by other countries, and that she'd never seen such a strong financial and corporate response to the actions of a country. But what from her perspective is a great outpouring of support instead appears to me as a distancing, a sterilising of an issue. Why focus on opening our borders to refugees when we can stop Russia's access to SWIFT? What does Sainsbury's renaming the Chicken Kiev really mean? https://www.independent.co.uk/life-style/food-and-drink/sainsburys-chicken-kiev-kyiv-russia-ukraine-b2028661.html

It all feels so cold, so money- and marketing-oriented… brands have reputations to uphold, and that looks very different in 2022 than it did for my grandma back in 19?? during whichever other crisis she imagines when talking on the phone to us. Brand identities are social media driven: Ryanair, a brand with a notoriously popular TikTok account, has pledged to be the first airline back in Ukraine. https://www.theguardian.com/business/2022/mar/02/ryanair-will-be-first-airline-to-return-to-ukraine-says-ceo

How much of their decision to make this statement hinges on maintaining their popularity? How much of the day should we spend picking apart these motivations before we can get back to real, tangible action? Or is this it?

As media consumers we're barraged with information on the crisis, and in a post-truth age where media tactics and propaganda are more important than ever, how can we consume ethically, conscientiously? Even the notion of 'consuming' a crisis is sickening.

From fake news and misinformation to fan cam edits of Ukraine's president, Volodymyr Zelenskyy, how can we establish a way to navigate being a digital witness?

https://www.indy100.com/news/nastya-tuman-russian-tanks-tikok
https://nypost.com/2022/02/24/ukrainian-women-say-russian-troops-are-flirting-with-them-on-tinder/
https://www.tiktok.com/@valerisssh/video/7072297754285903110
https://www.theguardian.com/us-news/2022/mar/08/portland-to-ukraine-frontlines-russia-invasion
https://vm.tiktok.com/ZMLyCXJgv/
https://vm.tiktok.com/ZMLyCuJ6r/
https://vm.tiktok.com/ZMLyCx852/
https://vm.tiktok.com/ZMLyC4P7D/




Other points from this week’s readings:

Clouds Over Sidra -> VR as "empathy machine". Remote access to empathy feels like a very safe and convenient way to be able to say you can understand someone else's lived experience.

Forensic Architecture and Forensic Oceanography take a much better stance in my opinion: rather than claiming to create empathy, they focus on fact and accurate reconstruction.



Week 15 Computational Art and Object-Oriented Ontology.

How do nonhumans experience their existence?
How does the machine experience?
Does a pixel have a favourite colour to be?

There was no class this week, and therefore no readings, so rather than the usual blog I have collated some resources on OOO that I find interesting.

I was especially interested in the discourse in Bibi Burger's text, in which the intersection of Donna Haraway's 'god trick' with OOO is discussed. Looking at the human as object/objectified/othered. The object which can speak of experience.


Boysen, Benjamin. "The Embarrassment of Being Human: A Critique of New Materialism and Object-Oriented Ontology." Orbis Litterarum 73, no. 3 (2018): 225-42.

https://onlinelibrary.wiley.com/doi/pdfdirect/10.1111/oli.12174

‘This discourse heavily anthropomorphizes material reality (earlier said to preclude human reality), thus leaving us with “a kind of spiritualism without gods,” as Slavoj Žižek remarks in Absolute Recoil (Žižek, 2015, 9)’

‘For when things are rendered by signs—aliquid stans pro aliquo, as the schoolmen of the Middle Ages had it—things are always presented by a proxy or substitute in their own absence, meaning that we never get to access unmediated reality.’

‘Semiophobia would thus be the latest tale in a long Western history dominated by the metaphysics of presence.’

‘The utopian sway of new materialism finds an outlet in the image of the pre-critical and not yet individuated mind of the child. The child not yet sharply making distinctions between the inner and the outer, itself and its surroundings, satisfies the call for naiveté and immediacy.’

‘New materialism and object-oriented ontology dream of undoing language and obliterates differences in favor of a regressive, childlike state of immediate unity.’

‘If, then, New Materialism can still be considered a variant of materialism, it is in the sense in which Tolkien’s Middle-earth is materialist: as an enchanted world, full of magical forces, good and evil spirits, etc., but strangely without gods—there are no transcendent divine entities in Tolkien’s universe, all magic is immanent to matter, as a spiritual power that dwells in our terrestrial world. (Žižek, 2015, 12)’


Burger, Bibi. "The Nonhuman Object in Ama Ata Aidoo’s ‘Nowhere Cool’: A Black Feminist Critique of Object-oriented Ontology." Agenda (Durban), 2021, 1-12.
https://www.tandfonline.com/doi/pdf/10.1080/10130950.2021.2011334?needAccess=true


‘The schoolgirl, who is presumably also Sissie, is here called Sarah − an anglicised name perhaps chosen by the teacher, Miss Jones. Sissie is unable to take on the persona of “Sarah” expected of her in the literature class and read the prescribed text from a Eurocentric perspective.’

‘On the other hand, I will argue, when the human-object does figure in Harman’s thought, it is still a case of the western bourgeois Man “which overrepresents itself as if it were the human itself” (Wynter 2003, p. 260).’

‘the contingency central to black feminist thought runs counter to the “overrepresentation of Man” and “god trick” which still underlie OOO as a specific instantiation of posthumanism’

- In reference to Donna Haraway's notion of the 'god trick' quoted above.


Cole, Andrew. "The Call of Things." The Minnesota Review (Minneapolis, Minn.) 2013, no. 80 (2013): 106-18.
https://read.dukeupress.edu/the-minnesota-review/article/2013/80/106/47982/The-Call-of-ThingsA-Critique-of-Object-Oriented

Harman, Graham. Object-oriented Ontology: A New Theory of Everything. Pelican Book; 18. London: Pelican Books, 2018.



Week 14 Computational Arts and Touching Visions

Hapticity, and marketing haptic experience as desirable because of realism/intensity. One of the first results when googling 'haptic suit' is this video: https://www.youtube.com/watch?v=TojeeAoqdSU, in which 'TheProGamerJay' attempts to experience real pain in a haptic suit by turning it up to full intensity and playing games in which he is shot, stabbed, etc. In classic YouTuber style the results of this aren't nearly good enough content for hungry viewers, so he mods the suit by lining it with sandpaper in an attempt to experience real pain. This feels like a much more realistic application than adverts such as https://www.youtube.com/watch?v=rFcbVrQWJSU suggest. You mean to tell me people are realistically spending exorbitant amounts on haptic suits just to have a simulated trainer teach them Krav Maga?

Haptics are marketed as sexy and intense, as providing experiences both beyond the real and identical to it.

Mattia questioned if these haptic experiences were framed by a puritanical view of the body. The combination of sterile, distant touch and what is portrayed as an endless library of individual, intimate haptic experiences does feel puritanical. Haptic suits and VR experiences seem to be desirable because of the opportunity to have whatever you want without needing to share your body. Donning an additional garment to feel intimacy, whether that be sexual, something else, or the peculiar intimacy(?) of choosing to engage in violence.

It seems to all loop back to the fetishization and taboo-ification of touch, and how this makes it possible to commodify. 


But despite how sterile haptic experiences may seem, an unwanted haptic experience still has the potential to be deeply violating.

“Any function that makes a virtual handshake possible may also enable virtual groping.” - https://publicseminar.org/2018/05/five-theses-on-virtual-reality-and-sociality/




Week 13 Sensing Practices

“Technology must be accepted in all its aspects: as the human dream of optimisation, but also as its consequences and the waste”

“The problem is seeing technology only as the ‘intention’ and not also as the by-products and their extended links/network.”

- Elaine Gan

Feral Technologies: Making and Unmaking Multispecies Dumps

https://www.anthropocene-curriculum.org/project/campus-2016/feral-technologies


Looking at the wide array of data collections and sensing practices from this week's lecture, I'm struck by how human they can be. Many of the examples centred on collecting data from nature feel like the computational fever dream of some eccentric historical figure who would've spent decades out in a forest with a notebook. Though these sensing practices are vastly accelerated by computation, they feel like a natural progression from previous modes of study. Due to the difference in data collection methods and transparency between practices like this and those which involve scraping the internet or other unscrupulous and opaque methods, I find myself feeling much more forgiving about this optimisation of study.

Sensing practices which make data accessible and legible to all audiences have the potential for very powerful application, especially if they collect quality data in an ambient(?)/unobtrusive/ethical way.

That being said, how much data is too much?

We already collect data just for the hell of it, will that ever end, or are we destined to incessantly prod and poke and observe the world around us? Did anyone consult the insects before we started monitoring their traffic? Why do we feel entitled to data on so much, so much of the time? The grim pursuit of knowledge. I’m tired. Maybe even if I went to live in the wilderness survival style someone would wire up a bird to a sensor which would eventually give away my whereabouts because of an unusual pattern in the bird’s behaviour. The bird has found an unnatural food source ALERT someone’s out in the woods eating cooked mushrooms.



Week 12 

Batool El Dasouki Abdalla's talk this week brought up many interesting ideas for me. Their work is centred on magic and art, and they spoke at length about the importance of defining these terms when producing work. Overall, they focused on magic as ritual, and as the 'pursuit of understanding unintelligible knowledge'. I wrote my undergraduate dissertation on magic and technology and so found it very intriguing to see the difference in our lenses on the subject. My research was framed through an industrialised western lens, and because of this I found it difficult to frame my findings in a positive light. My research into magic continually came back as individualistic, tending to utopian desires, and I believe that this was primarily because it was heavily influenced by productivity culture and the pursuit of technology in search of greatness.


This week in class we explored the idea of critical making, including the perspective of 'maker' as a sanitised or institution-friendly version of 'hacker'. The premise of critical making as a way of reframing the relationship between technology and society through personal investment seems to be why this 'maker' compromise is necessary. Though I believe 'hacker' to be a much more powerful and liberating term, I can appreciate the necessity for 'maker' in an institutionalised space. I also think that 'maker' benefits from not carrying all of the same context as 'hacker'; it is able to shed the weight of the heavily generalised vision of the hacker as anonymous loner criminal man. In terms of hacker artist vs critical maker artist, I find myself disinclined towards the latter. But perhaps that is more a reflection of my distaste towards the institutionalisation of art. I'd prefer not to sanitise art to fit the bounds of a funding application or the gallery space. But having said that, the hacker as artist is likely to only be embraced by the art world if they fit into the same brand space as the masculine anonymous or big man icon. I think my problem is just with the institution and the art world, to be honest. Is there any way to be ethical and successful without compromising your work, commercialising your identity into a brand, or sanitising yourself?

Whatever the solution is I’m not going to figure it out today, so I’ll leave it there.


Week 11

In Material Semiotics, John Law states “Everything becomes an ‘actor’ not because people aren’t human but because this is methodologically useful”.

Allowing ourselves to move past the feeling of needing to reason with flat ontology as an idea gives us the opportunity to move into a less anthropocentric view of importance in webs. Perhaps. Briefly entertain my attempt to see from a non-human-centric perspective as I wax lyrical from within my very human body. I'm giving it a go.

In Staying with the Trouble, Haraway writes “It matters what stories make worlds, what worlds make stories”.

The ideas of tentacular thinking that she explores in this text, of feeling out ideas, weaving and trying, of considering but maybe never deciding, resonate with me. Perhaps a tentacle of thought (? human) might reach out into its surroundings and reason a conclusion based on a brain (or some other system of knowledge/reasoning/data storage) at one end and an environment at the other.

Though one assumes the owner of the tentacle comes into being with some innate knowledge of tentacle-ing, most of its ability to feel and reason and maybe understand is externally influenced. Sympoiesis. The tentacle did not make itself, and it did not become the tentacle it is alone. But here it is, being a tentacle in a web in which it influences and is influenced.

I hear that mother octopuses die right as their babies hatch. They lay their eggs down in the deep and latch themselves to a rock. They have evolved to hold out without food until such time as the babies are nearly ready. Even if the baby octopodes feel as though they have come into being independently, they are standing on the shoulders of their mother who went before them. Their mother allowed them the time to develop into little tentacled beings that will hopefully make it on their own. They join a web she was once a part of. Is still a part of? It will change with them as it did with her, growing and ebbing like a perpetual stew. Here I am anthropomorphising an octopus and reasoning its semelparous existence as sadly grim within my own frame of understanding. Hmm.

In John Law's text he works through several examples of actors and webs. He explores a specific web in which three sets of actors (fishermen, scientists, and scallops) perform their fragile web. A web which is broken by a broken agreement. He writes that "Everything is a process". Some webs may seem solid, but this seems to come down to an entrenched history of the relations of the web being adhered to. In this example the fishermen stuck to the new web for a time, but eventually reverted to the established system. The scientists attempted to create a new protected scallop population, and it was a success, but then the fishermen fished the scallops. The fishermen had their own webs, and for them the immediate and familiar webs took precedence.


I seem to have written a lot about the ocean now so I think I will leave it there.



WEEK 9/10
https://www.youtube.com/watch?v=Ghx0sq_gXK4

Loss of personal sovereignty, loss of privacy, increase in privatisation, platforms selling back and forth between themselves, trading confidence in confidence. Moving past capitalism into what is being described by Varoufakis as technofeudalism. A new system that has emerged with platforms, a churning new reality in which the new lords are disguised by parent companies, platforms, subsidiaries, multinationals… some other opaque load of jargon.

Listening to Varoufakis talk, and Žižek unusually not, I can feel myself having a visceral reaction to his theory. There's something deeply uncomfortable about having to listen to a verbalisation of a reality we've all seen the edges of, felt the injustices of, but not necessarily fully mapped out. Having done gig work, including working for a call centre that specifically handles calls outsourced from the NHS, I feel as though I've seen one minuscule corner of the underbelly of some of these systems. It feels distinctly unnerving to be told that this job, this way of being, took your sovereignty. I felt it when companies began using webcams to monitor the attention of their employees in work-from-home roles. I felt it waiting till 7pm on Christmas Eve to know if I'd be working on Christmas Day of the worst year of my life. I knew it taking test after test on GDPR rules and robotically repeating privacy statements I'd be flagged for missing, only to be replaced by an additional level of IVR.


But somehow watching Slavoj Žižek, a figure I had firmly placed in the sphere of my academic unreal life, ask for an elaboration feels like he's looking at a part of my life I cannot happily reconcile with actual me. "Work mode" … "Focus mode" … productivity, productivity, productivity. Why is so much of oneself reduced by this style of work?

Neal Stephenson's Snow Crash, which contains what is, as far as I know, the earliest talk of a metaverse, opens with a detailed description of how economic systems have created the need for a high-speed, armoured and armed pizza delivery service. Hiro Protagonist is the Deliverator. Protecting the core experience of life in an America so devoted to private corporations and personal freedoms that in order to deliver a pizza to a gated suburban "burbclave" you'll need a gun and what is essentially a tank.


Stephenson's opening description of the urgency with which Protagonist must work in order to meet exacting company standards feels all too familiar. The way things are heading, perhaps the most unrealistic part of Stephenson's vision is that Protagonist's pizza delivery tank is company-provided.


WEEK 8

Reading through Garnet Hertz's Disobedient Electronics, the most striking projects to my mind are the machines that exploit loopholes by shifting responsibility to the machine, enabling the humans behind the projects to circumvent the law. In this collection the circumvention of the law is generally done to work around outdated/unethical/extreme laws. For example, the abortion drone, which delivered abortion pills from Germany, where they are legal, to Poland, where they are not. Also the Campus Carry doorbells, which read aloud the official statement needed to ban concealed handguns from a specific space on a university campus, thereby removing the onus from the individual to consider, remember, and recite the statement to anyone entering their space.

There’s something interesting here about utilising the same distancing from blame by putting the responsibility on an unthinking machine as we discussed earlier in the term when we looked at algorithms, synecdoche, platforms etc.

Jacob Gaboury opens Critical Unmaking: Toward a Queer Computation with a passage from Ray Kurzweil:

“If there is an ethos that drives our contemporary digital culture, it is the belief that technology improves over time and that improvements in technology bring improvements to our lives. It is this desire for the new that supports our faith in the potential good of technology in bringing about a better future (Kurzweil 1990)”

Disobedient Electronics rejects the idea of waiting for technologies to bring about a better future, instead exploring the potential of hacking/D.I.Y.-ing new technologies to suit your own agenda.

Though these projects are generally prototypes/artworks with no way to viably mass-produce them to solve large-scale issues, a lot of them have the potential to be reproduced and seem to be very open about how they were made. There are even some examples of instructions to D.I.Y. some of the projects in the book. Instructions give transparency and accessibility to the projects, but also have the potential to give the projects much greater reach whilst not relying on standard flows of production etc.


WEEK 7


Cyborg as freedom versus cyborg as constrained. The transhumanist fantasy is the notion of the cyborg as freedom from biology, but in the cyborg, consciousness remains situated. The situation around it may be altered by the addition or evolution (?) of technologies. But it remains situated always.

In Donna Haraway’s Cyborg Manifesto she writes:

“Late twentieth-century machines have made thoroughly ambiguous the difference between natural and artificial, mind and body, self-developing and externally designed, and many other distinctions that used to apply to organisms and machines. Our machines are disturbingly lively, and we ourselves frighteningly inert.”

To Haraway the cyborg is no far-off concept. It is already here, and therefore is not something that should percolate through fantasy and fiction, dripping into great pools of mirage to dive into for escapism.

Haraway comments on our inertia, and I think this is where the divide in opinion really comes into play. In a society structured around and obsessed with productivity, how seductive is the potential of a cyborg form with limitless energy, a searchable brain with neat racks of data? What a beautiful thing to picture, and an impossibly great joy to experience such ease. Haraway also writes that she would rather be a cyborg than a goddess. But isn't the cyborg of fiction the same thing?

As David Porush says: “Every time culture succeeds in revolutionizing its cybernetic technologies, in massively widening the bandwidth of its thought-tech, it invites the creation of new gods.”

Cyborg and God may be equally aspirational roles, but goddess seems to hold a more limited position. Separated by suffix, goddess conjures up images of nature and wisps and fruit trees while God smacks of power, wrath, lightning, and war. How unfair. And how unfair that I have not associated power with nature.


Machines are embodied just as we are embodied. We share the experience (?) of being situated. Situated in the world. In culture. In nature. In physicality. 

How do we move forward together? Do we cease our attempts at creating them in our image, and then striving to compete with our creation? Do we merge ourselves? Or separate completely?

How does our form limit us, and how do we change those limitations?


WEEK 6

This week our focus was on situated knowledge. We had a lecture from Goda Klumbyte, who asked us to map how we are situated, and to compare it with those around us, eventually going on to work through a brief together using our range of situated knowledges.

In the lecture Goda referred to Haraway's reflections on 'The Persistence of Vision'. The eye, be that human, animal, or computer, is always embodied. There is no possible all-seeing eye, as to see, the eye must be situated. An eye in a space or with abilities other than our own can go some way to creating the illusion of being all-seeing by expanding what we can naturally see, but it is only ever able to offer a window into another limited, situated view.

In Donna Haraway’s Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective, she writes “Feminists don’t need a doctrine of objectivity that promises transcendence, a story that loses track of its mediations just where someone might be held responsible for something, and unlimited instrumental power”. She goes on to reflect that a blissful ‘natural’ state where everything just exists sublimely is an impossible place to be empowered (?). She wants to challenge and make change. The swirling natural state of being seems to be a primordial soup to evolve and climb from, not a transcendent empty-headed bliss to run unconcerned toward.

Desire for a ‘natural’ state, or for transcendence, seems to be a desire to be disembodied. To be freed from the constraints of your situation. To no longer feel any need to find community, to neither feel lost nor found. To no longer identify with anyone. Wouldn’t it be nice to be de-situated? An ephemeral, all-seeing being? Total privilege in being unable to be held accountable. It feels pretty numb to think like this. A desire to isolate oneself totally in order to become linked with everything feels impossibly counterintuitive.

Histories of science may be powerfully told as histories of the technologies. These technologies are ways of life, social orders, practices of visualization. Technologies are skilled practices. How to see? Where to see from? What limits to vision? What to see for? Whom to see with? Who gets to have more than one point of view? Who gets blinded? Who wears blinders? Who interprets the visual field? What other sensory powers do we wish to cultivate besides vision?

- Donna Haraway, “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective”

The grounding feeling of really understanding something, of seeing the inner workings of a machine, of truly grasping your situation, could not feel further from being disembodied, yet somehow it still gives that endorphin rush we all crave. Perhaps I'm thinking about transcendence all wrong. Maybe we all are.

Maybe our disembodied visions just feel easier.

WEEK 5

Keywords for group project:
Fiction/fantasy/promise/utopianism and emerging tech, hyperreality, transhumanism, tech as fundamentally rooted in the physical, existing structures of power and how they can span realities (Metaverse, fictional and real).

This week our focus in class shifted from research to potential ideas for our group project brief.
The keywords I put forward centred on the looping ideas of the fantastical promises of technology, and how they compare to the fundamental physicality of tech. I’m interested in Le Guin’s ideas on sci-fi and the potential of sci-fi to be tragic if it only concerns itself with technology.

We've spoken a lot about the Metaverse in class, specifically the real-life version being brought about by a horrid platform/government combo. But there is an irony in naming the future of the internet after the fictional metaverse from Neal Stephenson's Snow Crash, in which the overarching takeaway was that existing power/class structures are impossible to keep out of any reality, virtual or otherwise, which is founded in the physical/real. I felt as though this really encapsulated the ideas of the relationship between the fictional and the real. Some kind of choose-your-own-meaning/reality for the uber-powerful. There is an overall sense that it is easy to see tech as bright and clean and full of potential if you have the privilege to ignore its physicality. If it works seamlessly in the background of your life, only coming to the forefront as a plaything, an experiment, something exciting.

As of today, we’re still working on choosing groups for the project, but the overall ideas for the larger group I’m in are:
Fiction/fantasy/promise/utopianism and emerging tech, hyperreality
Post-internet art i.e., Jon Rafman, Luyangasia, etc
ecology, alternate and simulated reality, simulacra
Digital Primacy
A Possible Future, Dystopia, Storytelling; Intelligent Machine
-VIRTUAL WORLD
- Establishment of emotional system of AI
-《Her》《free guy》AI ethics Based on film research
The relationship between ecosystems, Biology, and machines,
Game Theory, Service Design
Speculative futures, Fiction/fantasy, Hyperreality, Dystopia.


WEEK 4

Watching AI Debate: How far can the AI revolution go?, it seemed as though the overall theme was not how far AI is able to go, but rather what 'far' is. How do we measure the progress of something that key figures in the scientific debate are reluctant to define? Sir Roger Penrose reasons that "consciousness is not just something which happens to come along because something gets very complicated." Simply increasing the abilities, connections and power of a machine is surely not enough to spark consciousness, that is, if it is a thing to be sparked. Where is the line between computing complex connections between immense banks of information, and understanding? A person may feel as though they understand a subject, but is this anything other than an established set of connections between information? Could a glitch feel like déjà vu?

In Turing's Computing Machinery and Intelligence, he outlines his imitation game, a test to conclude how intelligent a machine is by seeing if a human is able to recognise it as a machine (through typed language only). But is this human-centric test not also a test of the human's intelligence? If an otherwise 'intelligent' machine is not able to pass as human, it is deemed not more intelligent than the human. Does this mean the inverse is true: if a human is particularly bad at judging this test, is the human outright less intelligent than the machine? Where is the room for spectrums and types of intelligence? What would the benefit be of making machines better at impersonating humans?

Mattia showed us a brief clip of the Google Assistant impersonating a human and booking an appointment at a hair salon. The assistant goes unnoticed by the human on the other end of the call, and successfully passes as a person. Is this really the pinnacle of intelligence? Where is the understanding?

Our inability to measure if a machine has understanding is tied in with our inability to comprehend our own consciousness. There is no 'it' in computing, but is there an 'it' in us? Is consciousness located? A glowing orb of being? Or is it the collective experience, the receiving and emitting of signals from a whole buzzing network?

If we can’t pinpoint an ‘it’ in ourselves, how can we ever hope to assess if a machine has one?

Perhaps we are working too hard at making machines in our own image. In To Be a Machine Mark O’Connell writes ‘Frustrated gods that we are, we have always dreamt of creating machines in our own image, and of re-creating ourselves in the image of these machines.’

Is the pursuit of computed intelligence just chasing a dream of omniscience, omnipotence, omnicompetence? Is this some shiny tech Iron Man/RoboCop male hero fantasy? Have we forgotten the dirty raw materials and labour of tech? If we pinpoint consciousness in AI, perhaps massification will be able to deliver us a reality in which the NPC in your video game will really wonder: "What is better - to be born good, or to overcome your evil nature through great effort?" Though I hope not. The ethics of interaction in a fantasy setting with conscious NPCs would surely be far too much of a headache.

In the AI debate Dr. Eugene Izhikevich jokes that we were promised robots. As the CEO of Brain Corporation he’s delivering on the promise. Perhaps the autonomous machines of today just aren’t as heroic as we wanted them to be. The powerful vision of a conscious AI, close enough to us to be integrated into society, but intelligent and autonomous enough to be intriguingly dangerous, still hangs just out of reach. It’s good to have something to dream about, I guess.


WEEK 3

In Ursula Le Guin's The Carrier Bag Theory of Fiction she outlines the idea that the vessel is the first tool/technology, and that it enabled, amongst many other things, the evolution of fiction. Le Guin theorises that the storyteller or protagonist is the bottle in which the story is carried. Therefore, changes to the bottle change the story. Marshall McLuhan's 'The medium is the message' springs to mind.

"If science fiction is the mythology of modern technology, then its myth is tragic," writes Le Guin, reflecting that so much of science fiction is defined by the hero: a notably masculine figure with hard edges and strong capital-'T' Technology. To Le Guin, it seems as though these tales of out-of-reach potential for power and fantasies about greatness are tragic. They have overlooked the potential for softness, near reality, and blurring lines that can be used in science fiction.

Mark O’Connell’s text To Be a Machine comes to mind; he investigates people interested in making these hard sci-fi dreams a reality. Grindhouse Wetware, transhumanists, Terasem. Though all these people seem totally devoted to their respective causes and beliefs, there is a definite air of tragedy and loneliness that comes with the desire to transcend the world in which everyone else resides.

In Blaise Agüera y Arcas' Art in the Age of Machine Intelligence he writes 'Cameras are "thinking machines"' and 'Photographers are cyborgs'. Going by Le Guin's ideas on technology, we've been cyborgs since the day we began to use the vessel as an extension of ourselves. I think the idea that being a cyborg is a far-off vision of greatness is what drives hard science fiction; there needs to be an intense dissatisfaction with the current state of being to give escapism any real edge.

As artists working with computation as our medium, how do we navigate the medium and its context?

In Walter Benjamin's The Work of Art in the Age of Mechanical Reproduction he writes 'If the natural utilization of productive forces is impeded by the property system, the increase in technical devices, in speed, and in the sources of energy will press for an unnatural utilization'. The existing setup of power within tech only serves to distract us and reduce meaningful productivity. Can art challenge this? Do we have to challenge this from within the constraints of these platforms to get any real viewership?

If we were cyborgs from the age of the vessel, why does it feel as though computation needs a whole new set of rules? Is the universal machine too complex to fit clean rules to? Or do we treat it the same as the camera, the brush, any other tool? The brush carries immense context. Is the difference here that the context of the universal machine is carried in a searchable fashion within it? Is it technophobia and techno-spirituality? Is the medium too magic?


WEEK 2

This week our readings and in-class session were focused on algorithms. In Tarleton Gillespie's article he writes about 'algorithms as synecdoche'. Although algorithms themselves are intrinsically literal, the use of the term 'algorithm' as a stand-in for a complex 'sociotechnical assemblage' has created mystery around the term. That isn't to say that algorithms themselves are necessarily simple or easy to understand, but that the expanded meaning of the word creates an added level of mirage.

In class the subject of emergence was brought up, in particular the idea that algorithms are a product of emergence, and therefore so is the expanded meaning of algorithm. The mirage of the algorithm and the ideas of emergence both hum with the same vagueness and opacity that keep the layman in his place. How can we empower ourselves to connect with and understand our data and the power it holds? How does this opacity benefit large platforms, who shift blame to a seemingly infallible algorithm whilst still holding all the rights to it?

Throughout the session we looked at examples of works influenced by algorithms.


MUC Love Letters (Manchester University Computer Department)

1953-54

“DARLING SWEETHEART

YOU ARE MY AVID FELLOW FEELING. MY AFFECTION CURIOUSLY CLINGS TO YOUR PASSIONATE WISH. MY LIKING YEARNS FOR YOUR HEART. YOU ARE MY WISTFUL SYMPATHY: MY TENDER DARLING.

YOURS BEAUTIFULLY

M.U.C.”

(Christopher Strachey)



I found this example from the session really intriguing: what a beautiful little note from a machine that has no idea what it is talking about. Is the illusion of knowing unique to machines? What makes its understanding through followed rules different to ours? Aren't we just very advanced at joining the dots?
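For a sense of how little machinery sits behind the note: Strachey's program is usually described as slotting randomly chosen words into fixed sentence skeletons. A minimal sketch of that template-and-word-list idea, with made-up word lists rather than Strachey's actual vocabulary or code:

#include <cstdlib>
#include <ctime>
#include <iostream>
#include <string>
#include <vector>

// Pick a random word from a list: the whole of the machine's 'understanding'.
std::string pick(const std::vector<std::string>& words){
    return words[std::rand() % words.size()];
}

int main(){
    std::srand(static_cast<unsigned>(std::time(nullptr)));
    std::vector<std::string> adjectives = {"AVID", "WISTFUL", "TENDER", "PASSIONATE"};
    std::vector<std::string> nouns      = {"FELLOW FEELING", "SYMPATHY", "AFFECTION", "LIKING"};

    // Fixed sentence skeletons with randomly chosen words dropped in.
    std::cout << "DARLING SWEETHEART\n";
    std::cout << "YOU ARE MY " << pick(adjectives) << " " << pick(nouns) << ".\n";
    std::cout << "MY " << pick(adjectives) << " " << pick(nouns)
              << " CLINGS TO YOUR " << pick(adjectives) << " " << pick(nouns) << ".\n";
    std::cout << "YOURS BEAUTIFULLY\nM.U.C.\n";
    return 0;
}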

The creative power of an algorithm/artificial intelligence is huge. Can there be a more devoted creative/creator? In Ron Eglash's TED Talk on African fractals he reveals the longstanding use of algorithms by humans as a creative tool. In Matteo Pasquinelli's text When Algorithms Learned How to Write he explains that 'Before digitalization, books revolutionized knowledge. They helped disburden our memory.' The algorithms, search engines, and digitisation aren't making us stupider; they are a new tool, a new place to store our memories and learnings. The knowledge passed to Ron Eglash through ritual had to be protected by an ancient system of belief to keep it accurate and safe. Now we have digital, searchable knowledge. Does the computer hold a new ritual? Spirituality and magic in the machine.


WEEK 1

The reading for this week, Femke Snelting's A Fish can't judge the water, reflects on the ostensibly seamless integration of software into our lives, and the jarring liminal moments in which we see its presence outlined. These moments often appear as we approach the boundary of a piece of software, or at the moments when the software demands an approval, a box to be checked, or a plug-in to be installed. The software needs our human help. Not because it cannot do these things, but because it has not been allowed to. Borders at which it must seek approval.

These same crunching edges are felt on glitches and errors. At incompatibility. They pique our interest; a great glitch can be quite something to behold. Visual glitches form maps, fizzing images that both cover and reveal the magic of the machine.

The politics of software is often concerned with these boundaries, one party wanting them sealed up, the only access a sterile window view inside. Import. Export. Don’t set foot inside. The other party wants these boundaries to be defined only to set one tool apart from the other. And even then, compatibility should allow blurring of these lines. From this perspective software is a recipe. Permeable, changeable, shareable.

This week Mattia asked us to bring an example of how software choreographs life, in response to Snelting's writing. I chose to think about how software dealing with time choreographs our existence. How many people now wake up at the exact same standardised 7am? Pre-software, 7am was an approximation. A bell ringing in the square to set your watch to, the radio and its delay, the arrival of a train. Trains are the original reason for any form of standardised time. Industrialisation gave us the steam engine, and with it came the need to know when it would set off from one town and arrive in the next. Now new global connections and industries sustain our need for standard time. Arranging a Zoom meeting with someone in a different time zone? You'd better google the time difference, so you don't accidentally arrange a midnight call.

I suppose in protest I could go analogue, but why would I own an unstandardised watch or alarm clock when my all-in-one solution has such a neat piece of software to use? Our interactions with time are generally through software on phones, and these apps offer a sealed-up set of available interactions. Rounded minutes for alarm input give me the option to wake at 07:00 or 07:01. Only a contrarian would choose to wake at a time not ending in a 5 or a 0.

I would imagine a good chunk of people will have at least one phone alarm that goes off every day. For most people the only reason they would have to stop and consider it would be if it fails. A charger switched off at the wall, a phone dead in the morning, a much-needed lie-in running day-shatteringly late.

Most of the time one's interaction with software time also goes unnoticed by other people: you wake alone, and choreograph your timings for the slot before you meet someone. The pinch point of the time you agreed to meet joins you and your schedules up like dots. The omnipresence of time means it takes a standout moment for it to be brought up. A moment like this is notable, apparently notable enough to make a TikTok about. This video points out the widely known experience of a group of girls' [1] alarms all going off at the exact same time, 9pm, while they are in a room together. Since the alarms go off at the exact same moment, the relationship with the software/clock app no longer disappears into the background; it is brought to the foreground by an inside joke that everyone whose alarm goes off at the same time is privy to. The video implies that the alarm is a reminder to take birth control; a not-so-subtle Megan Thee Stallion audio reaffirms this. The software is so ubiquitous that it is able to create culture, bringing to the fore otherwise private and individual experiences in such a, literally, loud way that these moments become the stimulus for viral videos.

Another perspective on our relationship with time and software has to do with the ideas of monochronic versus polychronic time. These ideas did not come about because of software, but places with inflexible monochronic views of time generally seem to be places that are also more ruled by software, or have been industrialised for longer, and therefore have a longer relationship with standardised time. There is a lot of room in the discourse about time to look at the inequality produced by the expectations of a society ruled by inflexible software time. This is political, and too large a topic for today, but something to think further about.



[1] The video mentions 'girls', which is why I use the word. The point of this example is not to include or exclude anyone, but, although it is a generalisation, the use here does potentially serve to clarify what is being implied without outright stating the punchline.