(Roughly) Daily

Posts Tagged ‘Archaeology’

“In the lingo, this imaginary place is known as the Metaverse”*…

Ethan Zuckerman on the history of enthusiastically working to make a dystopian vision real…

In a booth at Ted’s Fish Fry, in Troy, New York, my friend Daniel Beck and I sketched out our plans for the metaverse. It was November 1994, just as the graphical web was becoming a thing, and we thought that the 3-D web could be just a few tweaks down the road. In our version of the metaverse, a server would track the identity of objects and their location in virtual space, but you’d render the objects locally, loaded to your hard drive off of a CD-ROM. It made a certain sense: Most users were on sub-56k modems, and AOL was shipping out enough CD-ROMs to pave Los Angeles each week.

To be very clear, Daniel and I were in no way being original. We were hoping to re-create the vision that Neal Stephenson had outlined in his 1992 book, Snow Crash. We were both (barely) self-conscious enough to understand that Snow Crash took place in a dystopia, and that Stephenson was positing a beautiful virtual world because the outside world had become so shitty that no one wanted to live in it. But we were young and naive and believed that our metaverse would rock. (Stephenson, of course, wasn’t being entirely original either. His vision of the metaverse owed a debt to Vernor Vinge’s 1981 True Names and to a series of William Gibson novels from the ’80s. Both of those authors owed a debt to Morton Heilig’s 1962 Sensorama machine, and on and on we go, back in time to Plato’s shadows on a cave wall.)

Daniel and I got a chance to actually build our metaverse about six months later, after we both joined Tripod as graphic designers and “webkeepers.” This was well before Tripod became a competitor to GeoCities, offering free webpages to all. (It was also before I accidentally invented pop-up ads. Sorry again about that.) Instead, we were a lifestyle magazine for recent graduates, providing smart, edgy, but practical content—“tools for life”—while hawking mutual funds to 20-somethings. When that business model didn’t take off (can’t imagine why), the half-dozen folks in the “tech cave” revived the metaverse idea…

We sold our CEO on the idea by telling him that the MOO could be a simulation of life in the big city postcollege, bringing onto the site new users who wanted to experience New York City while still in Ann Arbor or State College. And remember, this was 1995: The photos we used to represent this metaverse of ours were taken on chemical film! Which we then developed at a photo-processing lab! And then scanned on a flatbed scanner!

The MOO was really cool, in theory. Most people weren’t building HTML-enabled multiplayer spaces in 1995. It got us our first round of venture-capital funding, demonstrating to our investors that we weren’t just kids translating mutual-fund propaganda into HTML. We were technology innovators. We were building things no one had ever seen before.

But here’s the thing: The MOO was garbage. On a good day, I could give a demo that made it look smooth, slick, and fun to use. But our CEO couldn’t. And that was a problem. It wasn’t his fault. The MOO was buggy and quirky and demanded that you think of the world as a set of six-sided cubes made up of webpages. Our boss pulled the plug on the project, telling us, “I know it’s the future, but if I can’t use it, I can’t sell it to investors.”

I watched other metaverses rise and fall. An Icelandic firm, OZ Virtual, introduced a metaverse with 3-D avatars in sexy streetwear dancing on an infinite dance floor, which felt like the future for a few days. OZ Virtual used VRML, a format for specifying 3-D objects in an HTML-like language that was all the rage for a few months in 1996. Netscape supported it via a plug-in, and Blaxxun built a 3-D chat space. Don’t remember these moments of web history? Neither does the web, for the most part. Wikipedia’s thorough, but not comprehensive, timeline of virtual environments misses our MOO, the Icelandic dance club, and half a dozen other early virtual experiments. (By the way: “Blaxxun”? That’s another Stephenson reference, to Black Sun Systems, the fictional company that created Stephenson’s fictional metaverse. Very creative, guys.)

And then there was Second Life. When Linden Lab launched this metaverse in 2003, there was a brief burst of enthusiasm where otherwise serious entities, such as businesses and universities, bought and built out their own islands in Linden’s proprietary world. (Harvard’s Berkman Center for Internet and Society, now the Berkman Klein Center, had its own island.) The learning curve to build objects in Second Life was steep, the universe was populated haphazardly, and the Second Life client demanded a very fast computer and a very patient user…

So, after watching metaverses spring up and crumble for 27 years, and after building one myself, I feel fairly well equipped to offer context for what Mark Zuckerberg is trying to do with his firm’s pivot to “Meta.” In his heavily produced keynote video for Facebook Reality Labs, Zuckerberg starts by acknowledging that this is a bizarre time for the company to be launching a new product line—Facebook is under more scrutiny than ever for its ill effects on individuals and societies, and for the company’s utter unwillingness to address these issues.

But why bother with that mess? Or, as Zuckerberg put it: “Now, I know that some people will say that this isn’t a time to focus on the future. And I want to acknowledge that there are important issues to work on in the present. There always will be. So for many people, I’m just not sure there ever will be a good time to focus on the future.” Allow me to translate: Fuck you, haters.

Let’s be frank about this: Facebook’s metaverse sucks. From the first images in which legless torsos sit around a conference room, staring at a Zoom-like videoconferencing screen, to Zuckerberg’s tour of his virtual closet, filled with identical black outfits (see, he’s got a sense of humor!), Zuck’s metaverse looks pretty much like we imagined one would look in 1994. Look, I’m playing cards with my friends and we’re in zero gravity! And one of my friends is a robot! You could do this in Second Life 10 years ago, and in somewhat angular vectors in VRML 20 years ago…

The metaverse Zuckerberg shows off [is] promising future technologies that are five to 10 years off. But it still looks like junk. The fire in his fireplace is a roughly rendered glow. His superhero secret lair looks out over a paradise island that’s almost entirely static. There’s the nominal motion of waves, but none of the foliage moves. It’s tropical wallpaper pasted to virtual windows. The sun is setting behind Zuckerberg’s left shoulder, but he’s being lit from the right front. Even with a bajillion dollars to invest in a video to relaunch and rename his company, Zuckerberg’s team is showing just how difficult it is to create a visually believable virtual world.

But that’s not the problem with Zuckerberg’s metaverse. The problem is that it’s boring. The futures it imagines have been imagined a thousand times before, and usually better. Two old men chat over a chessboard, one in Barcelona, one in New York, much as they did on Minitel in the 1980s. There’s virtual Ping-Pong and surfing, you know, like on a Wii. You can watch David Attenborough nature documentaries, like you do on Netflix. You can videoconference with your workmates … you know, like you do every single day.

Zuckerberg isn’t building the metaverse because he has a remarkable new vision of how things could be. There’s not an original thought in his video, including the business model. Thirty-eight minutes in, Zuckerberg gets serious, talking about how humbling the past few years have been for him and his business. Remember, he’s not humbled by the problem of Russian disinformation, or the spread of anti-vax misinformation, or the challenge of how Instagram affects teen body image. No, he’s humbled by how hard it is to fight against Apple and Google.

Faced with the question of whether Facebook’s core products are eroding the foundations of a democratic society, Zuckerberg takes on a more pressing problem: Apple’s 30 percent cut on digital goods sold in its App Store. Never fear, though: With a Facebook ecosystem, Facebook developer tools, and Facebook marketplaces, the custom skin you buy in one video game will be wearable in another video game, just like Mark’s black T-shirt. Just as long as that video game is in Facebook’s metaverse. (Meta’s metaverse? Meta’s verse?) And if you want Mark’s actual digital shirt, it will almost certainly be available as an NFT, which the launch video promises will be supported. Did I mention how dystopian this all is?

Facebook can claim originality in at least one thing. Its combination of scale and irresponsibility has unleashed a set of diverse and fascinating sociopolitical challenges that it will take lawmakers, scholars, and activists at least a generation to fix. If Facebook has learned anything from 17 years of avoiding mediating those conflicts, it’s not apparent from the vision for the metaverse, where the power of human connection is celebrated as uncritically as it was before Macedonian fake-news brokers worked to sway the 2016 election…

Neal Stephenson’s metaverse has been a lasting creation because it’s fictional. It doesn’t have to solve all the intricate problems of content moderation and extremism and interpersonal interaction to raise questions about what virtual worlds can give us and what our real world lacks. Today’s metaverse creators are missing the point, just like I missed the point back at Ted’s Fish Fry in 1994. The metaverse isn’t about building perfect virtual escape hatches—it’s about holding a mirror to our own broken, shared world. Facebook’s promised metaverse is about distracting us from the world it’s helped break.

It was terrible then, and it’s terrible now: “Hey, Facebook, I Made a Metaverse 27 Years Ago,” from @EthanZ.

For a nuanced (and provocatively “optimistic”) look at what a metaverse like Facebook’s could yield if in fact it worked (and then morphed), see Corey J. White‘s (@cjwhite) Repo Virtual.

And as (and for the reasons) noted in an earlier post, see “The Metaverse Is Bad,” from Ian Bogost (@ibogost).

* Neal Stephenson, Snow Crash

###

As we think twice, we might send adventurous birthday greetings to Giovanni Battista Belzoni; he was born on this date in 1778.  The 14th child of a poor barber in Padua, he was a barber, a Capuchin monk, a magician, and a circus strongman before finding his true calling– explorer (and plunderer) of Egyptian antiquities.

Belzoni’s call to action came when he met a British Consul-General named Henry Salt, who persuaded him to gather Egyptian treasures to send back to the British Museum.  Under extremely adverse conditions he transported the colossal granite head of Rameses II from Thebes to England, where it is now one of the treasures of the British Museum. Later, he discovered six major royal tombs in the Valley of the Kings, including that of Seti I, and brought to the British Museum a spectacular collection of Egyptian antiquities. He was the first person to penetrate the heart of the second pyramid at Giza and the first European to visit the oasis of Siwah and discover the ruined city of Berenice on the Red Sea. He stumbled into the tomb of King Ay, but only noted a wall painting of 12 baboons, leading him to name the chamber the “Tomb of the 12 Monkeys” (because hieroglyphs had not yet been deciphered, he usually had no idea who or what he had actually found).

Belzoni had two habits that have contributed to his legacy:  he was a lover of graffiti signatures, and inscribed “Belzoni” on many of Egypt’s antique treasures, where the carvings survive to this day.  And he carried a whip, which, given that he was one of the models for Indiana Jones, became one of that character’s hallmarks.

 source

“People are trapped in history and history is trapped in them”*…

The late David Graeber (with his co-author David Wengrow), left one last book; William Deresiewicz gives us an early look…

Many years ago, when I was a junior professor at Yale, I cold-called a colleague in the anthropology department for assistance with a project I was working on. I didn’t know anything about the guy; I just selected him because he was young, and therefore, I figured, more likely to agree to talk.

Five minutes into our lunch, I realized that I was in the presence of a genius. Not an extremely intelligent person—a genius. There’s a qualitative difference. The individual across the table seemed to belong to a different order of being from me, like a visitor from a higher dimension. I had never experienced anything like it before. I quickly went from trying to keep up with him, to hanging on for dear life, to simply sitting there in wonder.

That person was David Graeber. In the 20 years after our lunch, he published two books; was let go by Yale despite a stellar record (a move universally attributed to his radical politics); published two more books; got a job at Goldsmiths, University of London; published four more books, including Debt: The First 5,000 Years, a magisterial revisionary history of human society from Sumer to the present; got a job at the London School of Economics; published two more books and co-wrote a third; and established himself not only as among the foremost social thinkers of our time—blazingly original, stunningly wide-ranging, impossibly well read—but also as an organizer and intellectual leader of the activist left on both sides of the Atlantic, credited, among other things, with helping launch the Occupy movement and coin its slogan, “We are the 99 percent.”

On September 2, 2020, at the age of 59, David Graeber died of necrotizing pancreatitis while on vacation in Venice. The news hit me like a blow. How many books have we lost, I thought, that will never get written now? How many insights, how much wisdom, will remain forever unexpressed? The appearance of The Dawn of Everything: A New History of Humanity is thus bittersweet, at once a final, unexpected gift and a reminder of what might have been. In his foreword, Graeber’s co-author, David Wengrow, an archaeologist at University College London, mentions that the two had planned no fewer than three sequels.

And what a gift it is, no less ambitious a project than its subtitle claims. The Dawn of Everything is written against the conventional account of human social history as first developed by Hobbes and Rousseau; elaborated by subsequent thinkers; popularized today by the likes of Jared Diamond, Yuval Noah Harari, and Steven Pinker; and accepted more or less universally. The story goes like this. Once upon a time, human beings lived in small, egalitarian bands of hunter-gatherers (the so-called state of nature). Then came the invention of agriculture, which led to surplus production and thus to population growth as well as private property. Bands swelled to tribes, and increasing scale required increasing organization: stratification, specialization; chiefs, warriors, holy men.

Eventually, cities emerged, and with them, civilization—literacy, philosophy, astronomy; hierarchies of wealth, status, and power; the first kingdoms and empires. Flash forward a few thousand years, and with science, capitalism, and the Industrial Revolution, we witness the creation of the modern bureaucratic state. The story is linear (the stages are followed in order, with no going back), uniform (they are followed the same way everywhere), progressive (the stages are “stages” in the first place, leading from lower to higher, more primitive to more sophisticated), deterministic (development is driven by technology, not human choice), and teleological (the process culminates in us).

It is also, according to Graeber and Wengrow, completely wrong. Drawing on a wealth of recent archaeological discoveries that span the globe, as well as deep reading in often neglected historical sources (their bibliography runs to 63 pages), the two dismantle not only every element of the received account but also the assumptions that it rests on. Yes, we’ve had bands, tribes, cities, and states; agriculture, inequality, and bureaucracy, but what each of these were, how they developed, and how we got from one to the next—all this and more, the authors comprehensively rewrite. More important, they demolish the idea that human beings are passive objects of material forces, moving helplessly along a technological conveyor belt that takes us from the Serengeti to the DMV. We’ve had choices, they show, and we’ve made them. Graeber and Wengrow offer a history of the past 30,000 years that is not only wildly different from anything we’re used to, but also far more interesting: textured, surprising, paradoxical, inspiring…

A brilliant new account upends bedrock assumptions about 30,000 years of change: “Human History Gets a Rewrite,” @WDeresiewicz introduces the newest– and last?– book from @davidgraeber and @davidwengrow. Eminently worth reading in full.

* James Baldwin

###

As we reinterpret, we might spare a thought for Vic Allen; he died on this date in 2014. A British human rights activist, political prisoner, sociologist, historian, economist, and professor at the University of Leeds, he worked closely with British trade unions and was considered a key player in the resistance against Apartheid in South Africa. He spent much of his life supporting the South African National Union of Mineworkers (NUM) and was a key mentor to British trade union leader Arthur Scargill. In 2010 Allen was awarded the Kgao ya Bahale award, the highest honor afforded by the NUM. After his death he was widely commended by his fellow academics and activists for his lifelong commitment to workers’ rights and racial equality.

source

“An architect should live as little in cities as a painter. Send him to our hills, and let him study there what nature understands by a buttress, and what by a dome.”*…

We’ve misunderstood an important part of the history of urbanism– jungle cities. Patrick Roberts suggests that they have much to teach us…

Visions of “lost cities” in the jungle have consumed western imaginations since Europeans first visited the tropics of Asia, Africa and the Americas. From the Lost City of Z to El Dorado, a thirst for finding ancient civilisations and their treasures in perilous tropical forest settings has driven innumerable ill-fated expeditions. This obsession has seeped into western societies’ popular ideas of tropical forest cities, with overgrown ruins acting as the backdrop for fear, discovery and life-threatening challenges in countless films, novels and video games.

Throughout these depictions runs the idea that all ancient cities and states in tropical forests were doomed to fail. That the most resilient occupants of tropical forests are small villages of poison dart-blowing hunter-gatherers. And that vicious vines and towering trees – or, in the case of The Jungle Book, a boisterous army of monkeys – will inevitably claw any significant human achievement back into the suffocating green whence it came. This idea has been boosted by books and films that focus on the collapse of particularly enigmatic societies such as the Classic Maya. The decaying stone walls, the empty grand structures and the deserted streets of these tropical urban leftovers act as a tragic warning that our own way of life is not as secure as we would like to assume.

For a long time, western scholars took a similar view of the potential of tropical forests to sustain ancient cities. On the one hand, intensive agriculture, seen as necessary to fuel the growth of cities and powerful social elites, has been considered impossible on the wet, acidic, nutrient-poor soils of tropical forests. On the other, where the rubble of cities cannot be denied, in the drier tropics of North and Central America, south Asia and south-east Asia, ecological catastrophe has been seen as inevitable. Deforestation to make way for massive buildings and growing populations, an expansion of agriculture across marginal soils, as well as natural disasters such as mudslides, flooding and drought, must have made tropical cities a big challenge at best, and a fool’s gambit at worst.

Overhauling these stereotypes has been difficult. For one thing, the kind of large, multiyear field explorations usually undertaken on the sites of ancient cities are especially hard in tropical forests. Dense vegetation, mosquito-borne disease, poisonous plants and animals and torrential rain have made it arduous to find and excavate past urban centres. Where organic materials, rather than stone, might have been used as a construction material, the task becomes even more taxing. As a result, research into past tropical urbanism has lagged behind similar research in Mesopotamia and Egypt and the sweeping river valleys of east Asia.

Yet many tropical forest societies found immensely successful methods of food production, in even the most challenging of circumstances, which could sustain impressively large populations and social structures. The past two decades of archaeological exploration, applying the latest science from the land and the air, have stripped away canopies to provide new, more favourable assessments.

Not only did societies such as the Classic Maya and the Khmer empire of Cambodia flourish, but pre-colonial tropical cities were actually some of the most extensive urban landscapes anywhere in the pre-industrial world – far outstripping ancient Rome, Constantinople/Istanbul and the ancient cities of China.

Ancient tropical cities could be remarkably resilient, sometimes surviving many centuries longer than colonial- and industrial-period urban networks in similar environments. Although they could face immense obstacles, and often had to reinvent themselves to beat changing climates and their own exploitation of the surrounding landscape, they also developed completely new forms of what a city could be, and perhaps should be.

Extensive, interspersed with nature and combining food production with social and political function, these ancient cities are now catching the eyes of 21st-century urban planners trying to come to grips with tropical forests as sites of some of the fastest-growing human populations around the world today…

They may be vine-smothered ruins today, but the lost cities of the ancient tropics still have a lot to teach us about how to live alongside nature. Dr. Roberts (@palaeotropics) explains: “The real urban jungle: how ancient societies reimagined what cities could be,” adapted from his new book, Jungle: How Tropical Forests Shaped the World – and Us.

* John Ruskin

###

As we acclimate, we might send thoughtful birthday greetings to Sir Karl Raimund Popper; he was born on this date in 1902.  One of the greatest philosophers of science of the 20th century, Popper is best known for his rejection of the classical inductivist views on the scientific method, in favor of empirical falsification: a theory in the empirical sciences can never be proven, but it can be falsified, meaning that it can and should be scrutinized by decisive experiments.  (Or more simply put, whereas classical inductive approaches considered hypotheses false until proven true, Popper reversed the logic: conclusions drawn from an empirical finding are true until proven false.)
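Popper’s asymmetry between verification and falsification can be sketched in a few lines of purely illustrative code (the “all swans are white” hypothesis and the observations are invented for the example, not drawn from Popper):

```python
# Toy illustration of Popper's point: no finite run of confirming
# observations proves a universal claim, but a single counterexample
# decisively refutes it.

def falsified(hypothesis, observations):
    """Return True as soon as any observation contradicts the hypothesis."""
    return any(not hypothesis(obs) for obs in observations)

# The universal claim under test: "all swans are white."
all_swans_are_white = lambda swan: swan == "white"

# A thousand white swans corroborate the claim but cannot prove it...
print(falsified(all_swans_are_white, ["white"] * 1000))   # prints False

# ...while one black swan is enough to falsify it.
print(falsified(all_swans_are_white, ["white", "black"]))  # prints True
```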

Popper was also a powerful critic of historicism in political thought, and (in books like The Open Society and Its Enemies and The Poverty of Historicism) an enemy of authoritarianism and totalitarianism (in which role he was a mentor to George Soros).

 source

“A culture, we all know, is made by its cities”*…

 


Çatalhöyük after the first excavations

 

Welcome to one of the mothers of all cities, Çatalhöyük, a community on the Anatolian plain that is now part of Turkey. … [Nine thousand] years ago … Çatalhöyük consisted of attached dwellings covering 33 acres. … The city was so new back then, they hadn’t invented the street yet — or the window. So the only way you could get into your apartment was to walk over your neighbors’ rooftops. A ladder was propped against the skylight opening of your apartment.

Çatalhöyük lacked something much more significant than streets and windows. There was no palace here. The bitter price of inequality that the invention of agriculture cost human society had yet to be paid. Here, there was no dominance of the few over the many. There was no one percent attaining lavish wealth while most everyone else merely subsisted or failed to subsist. The ethos of sharing was still alive and well. There is evidence of violence against women and children, but the weakest ate the same food that the strongest ate. Scientific analyses of the nutrition of the women, men, and children who lived here show a remarkable similarity, and everyone lived in the same kind of home. … Dominating [every] room was a giant head of an auroch with massive pointed horns, mounted on the richly painted wall. The walls were lavishly festooned with the teeth, bones, and skins of other animals.

The apartments at Çatalhöyük have a distinctly modern look. The floor plan is highly utilitarian and modular, uniform from dwelling to dwelling, with cubicles for work, dining, entertaining, and sleep. Bare wood beams support the ceiling. It was home for an extended family of seven to ten people…

More of this excerpt from Ann Druyan’s Cosmos: Possible Worlds at “The First Proto-City.” (Via the ever-illuminating delanceyplace.com)

* Derek Walcott

###

As we appraise our antecedents, we might spare a thought for Muhammad; he died on this date in 632.  The founder of Islam, he is considered by its adherents to have been a prophet– the final prophet– sent to present and confirm the monotheistic teachings preached previously by Adam, Abraham, Moses, Jesus, and other prophets.  He united Arabia into a single Muslim polity, with the Quran (the transcriptions of divine messages that he received), as well as his other teachings and practices, forming the basis of Islamic religious belief.


Muhammad receiving his first revelation from the angel Gabriel. From the manuscript Jami’ al-tawarikh by Rashid-al-Din Hamadani, 1307

source

 

 

Written by (Roughly) Daily

June 8, 2020 at 1:01 am

“In a series of forms graduating insensibly from some apelike creature to man as he now exists, it would be impossible to fix on any definite point where the term ‘man’ ought to be used”*…

 


Homo sapiens finger bone, dating back some 86,000 years, found at a site called Al Wusta in Saudi Arabia

 

Darwin turns out to be right about the difficulty of dating the emergence of man, not only for the reason he intended (that our emergence from prior species was so gradual as to be undetectable as an “event”) but also because it’s turning out to be difficult to date the earliest examples we can agree are “man” and to figure out when they reached the places they settled…

The Nefud Desert is a desolate area of orange and yellow sand dunes. It covers approximately 25,000 square miles of the Arabian Peninsula. But tens of thousands of years ago, this area was a lush land of lakes, with a climate that may have been kinder to human life.

On a January afternoon in 2016, an international team of archaeologists and paleontologists was studying the surface of one ancient lake bed at a site called Al Wusta in the Nefud’s landscape of sand and gravel. Their eyes were peeled for fossils, bits of stone tools, and any other signs that might remain from the region’s once-verdant past.

Suddenly, Iyad Zalmout, a paleontologist working for the Saudi Geological Survey, spotted what looked like a bone. With small picks and brushes, he and his colleagues removed the find from the ground.

“We knew it [was] important,” Zalmout recalled in an email. It was the first direct evidence of any large primate or hominid life in the area. In 2018, lab tests revealed that this specimen was a finger bone from an anatomically modern human who would have lived at least 86,000 years ago.

Prior to this Al Wusta discovery, evidence in the form of stone tools had suggested some human presence in the Nefud between 55,000 and 125,000 years ago. To anthropologists, “human” and “hominin” can mean any of a number of species closely related to our own. The finger bone was the oldest Homo sapiens find in the region.

The bone’s dating contradicts a well-established narrative in the scientific community. Findings, particularly from the area of modern-day Israel, Jordan, and Lebanon, known as the Levant region, have led to the understanding that H. sapiens first made their way out of Africa no earlier than 120,000 years ago, likely migrating north along the Mediterranean coast. These people settled in the Levant and their descendants—or those from a subsequent early human migration out of Africa—traveled into Europe tens of thousands of years later.

Only later, that story goes, did they journey into parts of Asia, such as Saudi Arabia. By some estimates, then, anatomically modern humans would not have been in what is now Al Wusta until about 50,000 years ago.

The finger bone, then, adds a twist to the tale of how and when our species left the African continent and, with many starts and stops, populated much of the rest of the earth. A new crop of discoveries, particularly from Asia, suggest that modern humans first left Africa some 200,000 years ago, taking multiple different routes…

Politics, geography, and tradition have long focused archaeological attention on the evolution of Homo sapiens in Europe and Africa. Now, new research is challenging old ideas by showing that early human migrations unfolded across Asia far earlier than previously known: “Will Asia Rewrite Human History?”

* Charles Darwin, The Descent of Man

###

As we return to roots, we might spare a thought for Jean-Léon-François Tricart; he died on this date in 2003.  He was a physical geographer and climatic geomorphologist known for his extensive regional studies in numerous countries of Africa.

Tricart was a pioneer in many fields of physical geography, including the study of a phenomenon central to the migration of early Homo sapiens: the major dynamic role of climate in landscape evolution.

source

 

Written by (Roughly) Daily

May 6, 2020 at 1:01 am
