‘Access’ to ‘Tools’
Teaching design
On the cover of the inaugural 1968 edition of the Whole Earth Catalog (WEC) – a magazine widely viewed as a precursor to Wikipedia, if not the internet as a whole1 – is written the slogan: “access to tools”. This idea – or perhaps mantra – of ‘access’ is what I wish to briefly discuss, as I think it may help us understand whose interests are served by the recent intensification and integration of AI technology into daily life, and what kind of a tool ‘AI’ really is (or could be). In doing so I want to frame this technology not so much as a ‘rupture’ or a radically new technology, but rather as a continuation and deepening of the integration of (American) software providers into everyday life and practice, as well as of the sets of norms and forms of alienation they establish within our discipline, both as graphic designers and as educators.
*
In the introductory pages of the WEC, Stewart Brand and his editors provide clear criteria for how they choose which ‘tools’ to include in their directory:
An item is listed in the CATALOG if it is deemed:
1) Useful as a tool,
2) Relevant to independent education,
3) High quality or low cost,
4) Easily available by mail.
In the late 1960s, ‘access’ meant – in practical terms – the same thing as ‘ownership’; this was certainly the view taken by the WEC, whose listed items were primarily books, hand-tools and walking boots, along with details of the shops from which one could purchase these goods remotely. In the time since this publication launched, the term ‘access’ has been radically redefined. The normalisation of end-user license agreements (EULAs), digital rights management (DRM), subscription platforms and other licensing arrangements has established what Aaron Perzanowski and Jason Schultz have called the ‘private regulatory schemes’ of contemporary tech firms; schemes which transform the sale of goods (such as software) from a ‘transfer [of] ownership to the buyer’ into ‘conditional grants of access’ (Perzanowski and Schultz, 2018).
The history of these licenses has emerged alongside the general development of information technologies and the decoupling – beginning in the 1980s with IBM – of hardware and software distribution. The concept of receiving a tool in the post implies a level of physicality, reconfigurability and independence from which we have long been distancing ourselves, but which still conditions our perceived – and perhaps idealised – relationship with the modes of production we engage with on a day-to-day basis. This gradual decoupling and dematerialisation of communication technologies, along with the rise of ‘private regulatory schemes’, has established a norm through which we – as practitioners – are made to continuously consume our tools as services.2 Rather than engaging with a tool as a mode of production which might be meaningfully ‘ownable’ (and therefore reconfigurable) in either a collective or individual sense, one rents time with it on the basis of a licensing agreement that nobody quite understands.
Today, such arrangements are often justified through the language of increased ‘access’ and ‘democratisation’. ‘AI-enhanced’ services such as ChatGPT and Canva advertise themselves as ‘general purpose’ – neither requiring nor teaching any skills – and as free to use, boasting generous ‘usage limits’.3 In doing so they seek to integrate both an ‘artistic’ critique of capitalist cultural production (focused on a lack of authenticity and personal expression) and a ‘social’ critique (focused on material inequalities and the exploitation of labour) (Boltanski and Chiapello, 2005).4 AI tools therefore claim to fulfil demands for creative autonomy while simultaneously promising to give us time back for ‘meaningful work’ – both through automation, understood as the expansion or acceleration of productive capacity. Autonomy requires engagement with process, yet automation removes us from it; free time requires deceleration, yet the logic of production today is one of acceleration and growth.
In my experience, this apparent resolution – which is in fact a contradiction – has intensified the alienation of ourselves and our students from the process and pleasure of creative, productive activity. Where the WEC envisioned tools that could be owned, understood, and reconfigured, contemporary AI tools (and their integration into existing software suites) offer only opaque interfaces to proprietary systems. We cannot inspect their training data, interrogate their logic, or modify their operation – we can only rent access to outputs generated through processes that remain fundamentally illegible to us. Even discounting the environmental costs, the privacy concerns and the ethics of scraped training data, this demand for speed runs counter to sustainable working practices and to the principles of meaningful education, which requires space for slowness, failure and play.
*
However – while obviously interrelated – the history of software licensing and the ideology of Silicon Valley are not the same thing as the history of the technology itself. And there is of course a counter-history of struggle against these dynamics within our discipline, one which includes the WEC, as well as more recent Free and Open Source Software (FOSS) projects, and which stretches as far back as the 19th century, to the Arts and Crafts movement, and perhaps further still, to the Chartists and Luddites. This history is alive today, and concerns not so much the inherent possibilities of new technologies as the forms of distribution, ownership and control which condition our ‘access’ to them.
If we reinterpret ‘Artificial Intelligence’ as an umbrella term for a broad set of techniques, we may find it to be both larger and smaller than ChatGPT and the recent swathe of Large Language Models (LLMs). If we want our students to engage meaningfully with these technologies – and we could perhaps interrogate that desire – we might begin with what I would call localised, ‘craft AI practices’. Such an approach builds on the WEC’s vision of tools that can be owned, understood and reconfigured, but also on earlier Arts and Crafts thinkers like Ruskin and Morris (n.d.), who understood experimentation with the organisation and distribution of tools and processes as sites of meaning-making, skill and collectivisation.
In teaching ‘creative coding’, I have worked with students as co-producers of specific AI systems, designed for particular purposes, built from largely open-source components – taking seriously the possibility of the production of AI tools as a craft practice. We have collectively generated datasets for image generation based on shared research interests. We have used Markov chains and natural language processing tools to explore alternative, generative forms of writing and research. These are the relatively simple computational methods that comprise and predate contemporary LLMs – what we used to call machine learning and ‘big data’. These are also the processes that AI approaches are demonstrably effective at: pattern recognition, statistical inference, the identification of relationships in large datasets. When we hear about legitimate applications of AI technology (medical imaging, protein folding, climate modelling, and – yes – generative art) this is the work being done, not the generation of slop or student essays.
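By way of illustration – and only as a minimal sketch, not a record of any particular class exercise – this is roughly what a word-level Markov chain text generator looks like in Python. The corpus filename is a hypothetical stand-in for whatever shared research material a group might assemble; the point is that every step (the corpus, the statistical model, the sampling) remains legible and open to modification.

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each sequence of `order` words to the words observed to follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=50):
    """Walk the chain from a random starting state to produce new text."""
    state = random.choice(list(chain.keys()))
    output = list(state)
    for _ in range(length):
        candidates = chain.get(state)
        if not candidates:  # dead end: restart from a random state
            state = random.choice(list(chain.keys()))
            candidates = chain[state]
        next_word = random.choice(candidates)
        output.append(next_word)
        state = tuple(output[-len(state):])
    return " ".join(output)

# Hypothetical corpus: any plain-text file the group has assembled together.
corpus = open("shared_research_notes.txt").read()
print(generate(build_chain(corpus, order=2), length=60))
```

A student can read this whole system in a few minutes, swap in their own corpus, change the order of the chain and watch the output shift – a small-scale version of the statistical pattern-matching that underpins far larger models.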
The literary critic Nan Z. Da describes AI as a ‘forcing move for cultural and intellectual work’ because it “forces us to go back to our foundations and define precisely what kind of humane intelligence we can offer”, as well as where and why this is constrained (2025). Following her and Kate Soper's work on post-growth living (2020), we might emphasise the possibilities for both individual and communal control over these means of production – even if this means limiting what they can do, or redefining what we want from them – rather than accepting the terms dictated by platforms whose business model depends on our continued dependency, acceleration and alienation.
Bibliography
Boltanski, L. and Chiapello, E. (2005). The New Spirit of Capitalism. London; New York: Verso.
Da, N.Z. (2025). Literary Criticism in the Age of AI. New Left Review, [online] pp. 103–127. doi:https://doi.org/10.64590/fmq.
Goldstein, J. (2018). Planetary Improvement: Cleantech Entrepreneurship and the Contradictions of Green Capitalism. Cambridge, MA; London: The MIT Press.
Perzanowski, A. and Schultz, J.M. (2018). The End of Ownership: Personal Property in the Digital Economy. Cambridge, MA; London: The MIT Press. p. 58.
Soper, K. (2020). Post-Growth Living. London: Verso.
Morris, W. (n.d.). How We Live and How We Might Live. [online] Marxists Internet Archive. Available at: https://www.marxists.org/archive/morris/works/1884/hwl/hwl.htm.
This talk was written for the Graphic Design Educator’s Network (GDEN) conference ‘BIG THINGS’.
Footnotes
1. In his 2005 commencement speech Steve Jobs described the WEC as “Google in paperback form”. Stanford (2008). Steve Jobs’ 2005 Stanford Commencement Address. YouTube. Available at: https://www.youtube.com/watch?v=UF8uR6Z6KLc (13:09).
2. Anyone who has worked at UAL knows this is as true for physical printers as it is for digital software!
3. A revealing turn of phrase.
4. For more on the ‘sociology of critique’ and the productive relationship between ‘artistic’ and ‘social’ forms of capitalist critique, see The New Spirit of Capitalism.