Things—the objects, tools, and artifacts of everyday life—are the material expression of human experience. Without them, we would lose track of what makes us who we are. Things outlast us, and we rely on them to tell our stories when we are gone. Things That Talk is a place for learning the language of objects from the full sweep of humanity. It is a durable, living archive of stories about the interconnected world of things. And it invites ongoing collaboration from a diverse range of communities. The platform facilitates storytelling by giving contributors all the basic curation, sequencing, and visual tools they need to narrate an object. And those tools are also evolving.
To maximize the durability and usability of the platform, we implemented a linked open data system that automatically pulls Wikidata content, hosted by the Wikimedia Foundation, into our interface. This simplifies the contribution process for our storytellers: they contribute images and reference data to Wikidata, which is then imported into Things That Talk and transformed into an uncluttered, intuitive, and visually appealing experience.
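As a rough sketch of what that import step involves, the snippet below flattens a Wikidata entity record (shaped like the JSON returned by Wikidata's `wbgetentities` API) into a simple display-ready record. The property ID `P18` is Wikidata's real "image" property, but the helper function and the record format are hypothetical, not the platform's actual code.

```python
P_IMAGE = "P18"  # Wikidata property ID for an item's image

def flatten_entity(entity: dict, lang: str = "en") -> dict:
    """Extract label, description, and image filename from a Wikidata entity."""
    label = entity.get("labels", {}).get(lang, {}).get("value", "")
    description = entity.get("descriptions", {}).get(lang, {}).get("value", "")
    image = ""
    # Claims are lists; take the first image statement that carries a value.
    for claim in entity.get("claims", {}).get(P_IMAGE, []):
        snak = claim.get("mainsnak", {})
        if snak.get("snaktype") == "value":
            image = snak["datavalue"]["value"]
            break
    return {"label": label, "description": description, "image": image}

# Example entity fragment, shaped like a wbgetentities response:
entity = {
    "labels": {"en": {"value": "astrolabe"}},
    "descriptions": {"en": {"value": "astronomical instrument"}},
    "claims": {
        "P18": [
            {"mainsnak": {"snaktype": "value",
                          "datavalue": {"value": "Astrolabe.jpg"}}}
        ]
    },
}

record = flatten_entity(entity)
```

Keeping the transformation this thin is part of what makes the approach durable: the canonical data lives in Wikidata, and the platform only reshapes it for display.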
We are grateful to have the Faculty of Humanities at Leiden University as our funding partner, as well as the Municipality of Leiden in the context of Leiden Kennisstad / Leiden Science City.
We would also like to thank the Nationaal Museum van Wereldculturen and the Rijksmuseum van Oudheden for their generous support and for helping us think about how to build bridges between academia and heritage institutions.
And thank you, Q42, Micrio, and Fabrique, for your awesome work!