Category Archives: Digital Culture

Bridging the digital divide

Elderly people and technology don’t flock together. They move at different speeds. And yet, Lewis Spiteri has managed to adopt the latest technology. Perhaps it’s his capacity to be curious and critical that has seen him successfully cross the bridge from a world without a phone to one with a smartphone.

Lewis, 71, has been using an iPhone as a communication and file-sharing medium for the past five years and has recently also upgraded to an iPad. He also owns a Kindle, even though he still prefers reading a bound book since the scent and feel of the paper draw him deeper into the character.

He has always chosen to keep abreast of evolving technology trends, which have changed so rapidly over the past decade. In fact, as we chat, we weave in and out of episodes from his life, for which technology remains a common thread.

Lewis was brought up in Vittoriosa and currently resides in a cosy apartment in Santa Luċija with his wife Josephine. Together, they raised three children and now have seven grandchildren, the eldest of whom is my friend. In fact, we coupled the interview with one of her visits. So on a sunny Saturday afternoon we drive south to be greeted by the bubbly Josephine, who kindly leads us to the living room, where Lewis is sitting on a sofa, enjoying a game of local football on a large screen.

While Josephine can be heard clinking in the kitchen preparing our tea and biscuits, Lewis and I gear up our conversation. He comes across as a courteous and reserved man, one who weighs his words carefully.

“It’s unbelievable,” he tells me. “I started my working life with an afro and now I’m as bald as an onion. I was still a teen when I started out as an apprentice at the dockyard, which was, in my belief, a technology hub. I learnt a lot about technology there. I remember having to come up with my own tools every day, and this skill allowed me to sharpen my thinking and learn how to be innovative. No ship we worked on was identical, so I always woke up to develop new concepts, to fail and try again until something worked. After around 30 years, I moved into an office to work in administration and later as a teacher.

Photo: Yentl Spiteri

“In the 1960s, we taught using a blackboard and a box of white chalk. Eventually, we had coloured chalk, so by the end of class, my hand would resemble a rainbow splashed in coloured dust.

“Thankfully, we soon got anti-dust chalk. By then, the blackboard was also rotational, an innovation which reduced the annoying process of having to constantly erase what I wrote. I still remember the introduction of the epidiascope. Have you ever seen one? I’m not sure you have. Gone are the days when I used to buy a set of transparencies. I remember I used to prepare the slides at home and then project them on the wall during class. This was a major improvement over the blackboard because we didn’t have to erase everything and the teaching material could be reused. However, this is nothing compared to now. Today I prepare PowerPoint presentations and can also use the internet in class.

“The internet made life so much easier for me, especially as a teacher. I remember having to print handouts which I would pass round in class. Now, I gather a list of my students’ e-mail addresses and send them their notes directly. Also, if someone asks a question, I can easily go on YouTube and explain through a video. Looking back, I can barely believe how we used to get any work done before. Today, it only takes me five minutes of preparation before a lesson, because all I do is enter the class, switch on my laptop, and I’m set,” he says.

“I have recently bought two external hard drives of 16GB each, just to make sure I’ll never run out of storage space. It’s unbelievable how external memory has changed the concept of filing. In my time, filing was literally papers, files, cabinets and a lot of physical space. Data retrieval has also become so easy. Before, I used to stress over a paper I might have misplaced, whereas now, all it takes is typing out the first three letters of the name of the document and it’s right there in front of you. Back in the day, only a magician could do that,” he smiles.

In 2000, Lewis was encouraged to read for an MBA.

“To be completely honest with you, I was initially quite hesitant since I didn’t think I would be that competent,” Lewis admits. “You see, my typing skills were close to nil – I used to type with only one finger. However, I went for it. I learnt to use the computer on my own, because I had no time to go for lessons. I remember back then, we still had a tower and a printer which sounded like a stone grinder. And of course, there was no internet, so all my research had to be done manually. I believe a lot in research. When you’re doing your own research, you’re learning how to search and gather information, how to be critical of what you are reading.

“I frequented the public library and the University. Basically, it was all about hard copies. And since a lot of my work was done through distance learning, I had to send my assignments by post. Just imagine how difficult it would have been if I hadn’t adapted to new technologies and learnt how to use the computer. I surely wouldn’t have managed, but I did. I got my MBA. I learnt on my own, the hard way.”

Curious to see how far our conversation has come, Josephine tiptoes back into the room to switch on the light, and as she leaves, I’m compelled to ask him one final question. How did the two meet?

“I met my wife in Valletta more than 50 years ago. We’re talking about the days when Paceville didn’t even exist. We were both in some teahouse and she caught my attention. I remember she was with a group of girls and I said to myself if I had to date one of them, it would be her. In our time there was no such thing as meeting someone online.

“I think that this is the only negative aspect of technology, because social media is steering us away from physical encounters and this is changing human nature. When we meet people face-to-face, we are studying each other not just through the words we say but even through the way we say things. Right now, I’m actually seeing you and you’re answering me in the here and now. I’m listening to your voice, noting your tone – there’s personal contact.

“I think when I first met my wife, it was love at first sight. Well, let’s not call it love at first sight, but I’m sure the emotion exists, because the phrase didn’t come out of nowhere. That deep feeling you get when you see a person for the first time can’t be replaced when seeing someone’s picture online.

“Anyway, Josephine and I didn’t date for a long time, maybe three weeks, or a month at most. Let’s say it was a month. Then I met her parents. Before we got married, I used to go to her house in Birkirkara after work. Remember, we’re still talking about the days when there was no telephone, so I literally had to go to her house in order to see her! And if I had to work overtime, I used to ask the bakery close to where I worked whether it would be okay for them to call and pass on the message.

“When I look back, I realise how life has gone by so fast that I never had the chance to stop and think. I started working at the age of 14, married at 21 and today, I’m 71. I never stopped, I just kept on learning. I believe the human being was created to move forward, to walk, to run. Unless you’re dead, keep going. It’s true there is a lot going on and society is constantly changing fast, but you can never stop progress. I believe that if you don’t accept new technologies, you’re very likely to fall behind, because technology offers so many advantages and it makes life so much easier.”

This interview was originally published in The Sunday Times of Malta on April 13, 2014. 


Why do we take selfies?

Some ten years ago, my childhood best friend and I would head down to our baroque capital each Saturday morning to window shop, gossip and sip strawberry McDonald’s milkshakes while overlooking the spectacular Grand Harbour views.

Then, we would visit the Savoy shopping mall and as part of our weekly ritual, squeeze ourselves into a photo-booth, insert an Lm1 coin (which would nowadays be roughly the equivalent of €2) and pull funny faces at the automated camera.

In 2003, neither of us had a mobile phone nor a digital camera. The photo-booth was our only means of documenting the outing.

If we were the same teens now, we’d undoubtedly be using our smartphones to capture selfies, and instead of keeping the shameful photos in our wallets (as we did), we’d keep a log of them on Instagram for the entire world to admire.

The “selfie” has quickly come to symbolise our culture in 2013.

In fact, the word selfie was recently named Word of the Year by Oxford Dictionaries.

Here’s the official definition: “(n.) a photograph that one has taken of oneself, typically with a smartphone or webcam and uploaded to a social media website.”

●●

What intrigues me about the selfie is just how quickly an act of vanity is coming to be accepted as a norm by society.

Boys Collage 1

Moreover, none of these people seem to be taking themselves too seriously. The expressions are mainly sexy, mysterious and playful.

How are selfies different from posing in front of a ‘traditional’ camera?

I’d like to think of the selfie as being very similar to looking into a mirror.

At least whenever I switch on my front-facing smartphone camera to capture a furtive selfie, the first thing I do is check that my face is in order, before eventually pouting or squinting at my reflection on the screen.

You see, whenever we look into a mirror, we go through an internal process of scrutinizing our appearance – we try to cover up the elements we dislike, and enhance the attributes we like.

Girls Mirror Selfie 2

However, we tend to do all this in the privacy of our bedrooms or in the bathroom.

We pull faces at ourselves in the mirror, experiment with our hair, try on new make-up, play dress-up – we perform and experiment with different identities within a safe and secure environment.

Now, with the selfie, we are taking the behaviour considered normal in front of a bedroom or bathroom mirror and placing it in the public sphere.

And this is perhaps one of the reasons why the selfie has sparked controversy; it is a new phenomenon, one that we love to hate, purely because the art of selfie-taking requires not taking yourself too seriously, acting goofy, and making public what was once carried out in private.

Girls Selfie 1

As a generation, we are the pioneers of the selfie as a means of expression. Meaning: there are those who have already embraced the selfie and harness it (e.g. teens and celebrities). Then there are those who are still testing the waters, and in the process, delaying the selfie’s full normalisation as an aspect of our culture.

●●

A selfie shared online is simply a process of bringing to the forefront what was once done in the background.

Basically, what the selfie is doing is unleashing our obsession with self-portraits; it has made what was once invisible, visible across the entire internet universe.

In fact, selfies have always existed, albeit in a different format.

●●

Frida Kahlo was a Mexican painter, best known for her self-portraits.

Through a set of brushes and a vibrant palette, Kahlo depicted how she perceived herself on an external level. In today’s vocabulary, she painted her selfie.

Frida Kahlo Self-Portrait

Painting is nowadays often perceived as time-consuming and expensive. In this regard, the smartphone has democratised the art of self-portraiture to the extent that selfies are taken, modified and shared instantaneously at no cost whatsoever.

But if we could take pictures of anything, why are we so interested in our faces?

Our face is the organ that distinguishes us from other people and is crucial to our identity. By flipping the lens and entering into the frame, we come to communicate deep ideas about who we are and where we fit into the world.

One of my favourite, and probably Frida Kahlo’s most famous, quotes reads: “I paint myself because I am so often alone, and because I am the subject I know best.”

The selfie is a phenomenon in which the photographer is also the subject of the photograph – just like the self-portrait, but through a different medium.

What is perhaps most striking about the selfie is the fact that we are given control over how we are seen by the world – control that was definitely lacking in the filter-less photo-booth that took my first selfies, ten years ago.

Unplugged: Why do people refuse to connect?

There is a homely Italian hang-out outside the Tal-Qroqq campus, which seats around fifteen people at its black and white plastic tables outside, and maybe another thirty on its two floors inside. At lunch hour on your average Wednesday, the place is full up. Students, scholars and staff have all gathered to refuel, gossip, network, and surf the net.

To my left, two male students are swiping through photos on their iPads, the girl in the corner is discreetly reading something off her laptop, while the couple behind me is snapping selfies on their smartphone.

I’m guilty as charged – my fingers are typing away at a laptop while I listen to The Killers on my iPod and attend to e-mails on my smartphone.

Given the seemingly unstoppable tide of wireless devices sweeping our planet, it may seem surprising that there are still people out there who have never used the internet. Today, to connect means to be online. Yet, in the EU, 33 per cent of citizens do not have internet access at home, and 29 per cent claim they never access the internet, while another survey found that 15 per cent of Americans do not use the internet at all.

Who are these black sheep, and why are they not flocking online?

At the moment, we are accepting a worldview wherein the adoption of new technology is the norm. Science and technology scholar Sally Wyatt wrote, “the use of information and communication technology (or any other technology) by individuals, organizations, and nations is taken as the norm, and non-use is perceived as a sign of deficiency to be remedied, or as a need to be fulfilled.”

As figures show, the majority have quickly come to adopt and adapt the internet to their everyday lives, demonstrating that the internet is no longer a luxury, but a given.

Just as users take an active role in shaping a certain technology through its use, non-users also contribute to the configuration of technologies across society and culture. Wyatt explains that it is important to get to know who these non-users are, and more importantly, why they opt not to conform to a rising culture of connectivity.

One would assume that the problem for these citizens is access; making the internet cheaper and providing education and training would therefore be among the obvious solutions to reduce the number of digital virgins.

This year, the EU has successfully achieved full broadband access across the entire continent, as part of the European Commission’s Digital Agenda, to make ‘every European digital.’

But enhancing access is based on the assumption that internet non-use is a problem to be solved, and that once these barriers are overcome, people will embrace the technology with open arms.

The way that technology is adopted into our everyday life depends highly on the demographic and psychological characteristics of its users. Like users, non-users’ age, gender, education and income also play a role in determining motivations for non-use. When asked about their main motivations, non-users gave a variety of answers for shying away from the internet.

In the recently published Pew survey (US), 34 per cent of participants do not use the internet because they feel it is not pertinent to their lifestyles. They claim to be disinterested and do not want to make use of the technology. Others mention concerns about privacy (viruses, hackers, spam) or find it frustrating or difficult to use. At least those who are offline are aware of the value of the internet: 44 per cent of these offline adults said they have asked a friend or family member to look something up, or complete a task on the internet for them.

Interestingly, even those who neither have a computer nor plan to use the internet in the near future express a belief that computer skills are becoming a necessity – even if they could not articulate activities for which they could potentially use a computer.

Age is a major factor in internet usage and, unsurprisingly, these people tend to stem from an older generation. Some 44 per cent of offline Americans are older than 65, while only 2 per cent are between 18 and 29 years old.

Moreover, those with lower incomes or less education are also more likely to be offline, as are those whose future goals are less clear than those of adopters. Among offline Americans, there are also those who are constrained by financial reasons (19 per cent) or by a lack of physical availability or access to the internet (7 per cent), which could even mean illiteracy.

Wyatt draws a distinction between non-users, that is, those who do not have access to a given technology, and the want-nots, those who consciously resist or reject a technology. She explains that it is the latter group which, if paid sufficient attention, can help in diversifying and enhancing technology.

Truth be told, I formed part of the want-nots until earlier this year, for I had refused to venture into the smartphone world. For a number of years, I got by with a Nokia phone whose most exotic feature was a torch light. As long as it fulfilled my basic need to send and receive texts, I was happy.

However, in the eyes of society, I was excluding myself from the 60 per cent of 16- to 24-year-olds across the EU who accessed the internet on the move. I was a black sheep – a non-adopter of new technology.

It was a personal choice that separated me from my contemporaries. To be frank, I can’t say that I wasn’t intrigued by smartphones, but I had my doubts. Apart from being expensive, I thought a smartphone would be intrusive, and I very much appreciated the notion that when I was out of the house, I was completely disconnected from the internet.

Usually, the diffusion of new technologies and behaviours across society occurs through a process of modelling and social influence. I was part of the diffusion, but as a spectator – a rebel against wireless internet technology. I wasn’t ready to take the plunge and have my ‘life changed’ in such a short period of time. I was protesting against the idea of being constantly connected, and at times, I romanticised the beauty of letter-writing; instead of falling victim to the future, I fell into the trap of nostalgia. Alvin Toffler would diagnose my behaviour as a symptom of future shock.

During the past five years, wireless technology has diffused across society, becoming a ubiquitous symbol of today’s culture. In this day and age, a high-speed internet connection is not merely restricted to the haven of our homes, or the conditioned air of our offices. Public spaces have also come to embrace wireless access. In fact, a lot of cafés, recreation centres and village squares offer free Wi-Fi and accessible power sockets to charge our devices, encouraging people to pull out their devices and stay connected, whenever, wherever.

Everywhere I went and whoever I was with, I was followed by this unspoken pressure to conform. Evolutionary psychology repeatedly shows how our basic human motive is to connect, and this is what eventually drove me into buying a smartphone: the basic need to connect.

I was getting tired of the resistance, which slowly made me feel like a grandma living in a twenty-something’s body. I had to adapt to a world with smartphones by getting one too. Having a smartphone made me feel connected, part of something. As superficial as it may sound, I belonged.

What is interesting is that when, as part of some personal research I conducted, I asked my friends why they had decided to buy a smartphone, I was met with ambivalent answers. Essentially, they explained that they don’t feel the need to own a smartphone and would willingly give it up; however, it makes it much easier for their friends to reach them.

So there: it’s not that they really needed their smartphones, it’s the tide in the wake of the culture of connectivity that’s swept them in.

After his year of self-imposed exile from the internet, The Verge journalist Paul Miller came to realise just how much easier the internet makes it to feel a relevant part of society. Without the internet, he fell ‘out of sync with the flow of life.’ “The internet isn’t an individual pursuit,” he writes, “it’s something we do with each other. The internet is where people are.”

Back in Tal-Qroqq, even though we were all sitting in the same café only to ignore each other, we were still all connecting, through the internet.

When Joshua Meyrowitz wrote in the mid-1980s that “to be out of touch in today’s world is to be abnormal”, the smartphone was still the stuff of science fiction. Today, it is a mainstream commodity and the most rapidly adopted technology in human history, for one main reason: the internet.

In 2013, the internet is all embracing. It’s unavoidable. Everything is the internet: we are the internet. Manuel Castells said that it is difficult to go back to a pre-networked society, just as we cannot go back to a world without electricity.

Connectivity is no longer something abstract, it has fashioned itself into a state of mind. Now we are tethered to the rest of the world through the internet enclosed in a pocket-sized device. For better or for worse, the internet has changed our lives forever. Despite the digital divide, in the future, everyone will be online.

And in my earphones, The Killers front-man echoes the words: “This is the world that we live in/no we can’t go back.”

Originally published in The Sunday Times of Malta on November 10, 2013.

Does foodtography ruin our appetite?

Over the past two years, my social media feeds have more or less evolved into a culinary still-life expo. We’ve gone from Facebook to Recipebook. But in truth, why are we meticulously documenting our culinary adventures and sharing them with a virtual public?

A leisurely scroll reveals that what was once a cacophony of people’s concerns and whereabouts has suddenly become more visual – and it’s not merely selfies, but also what people are eating. Because let’s face it, even Nanna’s lampuki pie deserves to have its online moment.

Foodtography is the relatively recent trend of taking pictures of food and sharing them online via social media platforms such as Instagram, Facebook, Twitter and Pinterest. What I find particularly interesting about this phenomenon is that the photos generally feature food, sans people.

A quick browse through my childhood photo albums shows pictures of people seated at long tables during summer barbecues and anniversary fenkati. But whatever the occasion, the main actor was not food – the focus was on the people and the eating experience as a whole.

On the other hand, with foodtography, the food has become the subject of the photograph, with most photos excluding the diner.

This concept is pertinent in marketing and advertising strategies. Take Foodspotting, for instance – this app, integrated with a map of restaurants close to your current location, showcases dishes that people have eaten. The app’s tagline – find and share great dishes, not just restaurants – encourages diners to shoot, tag and rate dishes under the #foodspotting hashtag. Restaurants can then promote their food while enticing new clients, for free.

Unless, of course, the people you are eating with believe that taking pictures of your food spoils the atmosphere of the meal. Food alone is a basic need for nourishment and survival, but eating together is deeply rooted in human culture. People who gather around a table are there to share more than just a meal; they share a conversation too. People come together for special occasions and construct collective memories and experiences over food.

Professor Signe Rousseau from the University of Cape Town, South Africa, believes: “Most of us love to eat, and we also love to tell stories through food. We all know that a picture is worth a thousand words, and as communication is becoming increasingly visual, we rely on others to make sense and interpret the food we share.”

Perhaps this is what we are trying to emulate through foodtography – a sense of virtual togetherness.

Self-proclaimed foodie Kim Davidson from Brooklyn, the US, recently looked into people’s motivations behind foodtography. A former avid foodtographer herself, she explains: “By combining photography with our storytelling capability, we are able to easily build discourses, especially for those who cannot partake in the meal with us.”

Photos capture special moments, thus providing information to those who aren’t present. Moreover, sharing such moments with an online audience enables people to engage in a discourse where personal memories are cued by photographs. “People’s relationship with food does not only satisfy our biological needs,” she continues. “It is also a profoundly social urge.”

Based on the ethos that sharing is caring, the internet and social media have created a virtual platform for foodie communities to gather and exchange their love and appreciation of good food. “Social media and food have one unique and seemingly genuine commonality, that of integrating people,” Davidson says. Indeed, social media does what food does best – it brings people together.

In this way, foodtography could also be perceived as a means of attracting people to one’s profile, increasing the chances of interaction via likes and comments, and thus satisfying one’s need for recognition. Additionally, the saying “you are what you eat” could also support the claim that foodtography is linked to the online shaping of our identity.

Recently, a group of researchers from Brigham Young University, Utah, the US, found that an obsession with foodtography could be spoiling our appetite. They claim that looking at too many photos of food can make our eating less enjoyable due to sensory boredom.

As far-fetched as this may seem, there might actually be a grain of truth here. After a whole morning shooting irresistible dishes for a restaurant’s new menu, a food photographer friend of mine told me: “I didn’t eat anything for lunch. It felt like my body had already digested the food.”

Pictures are a representation of our environment – they have the ability to evoke emotions and may thus seem to reproduce reality. In this way, when we trawl through foodies’ profiles, our bodies could be fooled into experiencing the food as if it were present in front of us. If you pay close attention, you might realise that you start to salivate as a result of your body’s physiological reaction.

By the end of 2010, 80 billion photos had been published on social media platforms. That goes some way to explaining why, nowadays, a lot of people don’t just write about what they’re up to: smartphones have facilitated visual communication, such that people also share photos of what they think, do and eat.

Foodtography has also facilitated the exchange of recipe ideas and created a whole new realm for advertisers. Moreover, food diaries may also eliminate the sense of loneliness one may feel when eating alone. However, we must remember to enjoy the company of others during a meal, since taking photos of food can alter the atmosphere when actually eating together.

This article was originally published in The Times of Malta, October 23, 2013.

A Cyberspace Odyssey

I’m sure that last October 14, most of you were sitting in the eight-million-strong audience watching Felix Baumgartner’s record-breaking jump live on YouTube. The 43-year-old Austrian skydiver floated for two hours in a purpose-built capsule towed by a helium balloon before taking a giant leap from 128,000 feet.

I watched him for an hour before the jump, on a 15-inch laptop screen, sitting with three of my housemates in our kitchen and nibbling freshly baked apple cake.

“It’s just like a movie,” exclaimed Tobi. And it’s true, because the idea of watching a 21st-century daredevil fall through the clouds seemed more like science-fiction than reality.

My adrenaline was pumping as though I were there, perched on his shoulder. Physically, I was in one place, but at the same time, I was totally immersed in another.

This dichotomy is experienced even more vividly when we’re online. So if we’re increasingly experiencing our world through various media, does it mean that life is becoming more virtual?

The history of media and technology has been driven by our quest for immediacy. The internet has refashioned and extended upon earlier media – mail became e-mail, telephone conversations turned into Skype calls, television and radio became YouTube and Spotify, and our printed photos became albums stored in the cloud.

Spanish sociologist Manuel Castells states that we are moving from virtual realities to real virtualities. He writes how, “we are not just on the screen through which experience is communicated, but we become the experience”.

The advent of new media has brought about a convergence of different dimensions of communication spanning the globe, which are blurring the boundaries between the real and the virtual realm. As a result of this, we Instagram our food to make our friends’ mouths water at lunch.

And we no longer have to wait for the evening news – social media and live blogs posted by citizen journalists take us directly on location, when and where the news is happening, in real time.

We also call our friends and relatives abroad for free and watch them as they speak to us. As a result, our experience of reality – the here and now – is affected on a sensory as well as on an experiential level. If we spend whole days on Skype with relatives abroad, this can alienate us from living the full reality of our physical surroundings, in turn making the physical world more virtual and the digital more real.

While watching Baumgartner hovering through the air, the conversation with my housemates turned to the concept of the internet. I asked them whether they consider the internet to be real or virtual, and was met with quizzical looks as they pondered answers which none of us had.

For most of us, any notion of how all this information arrives in our homes and workplaces is weirdly immaterial. The world behind the technology of the internet is something many of us fail to think about. It is taken for granted – we just don’t think about the why and the how. It just is.

Normally, we think about things when something goes wrong, like realising that your car runs on four wheels when you get a puncture.

In fact, journalist Andrew Blum started wondering what the internet was really made of when a squirrel chewed through a cable and knocked him offline. His recent publication, Tubes: A Journey to the Centre of the Internet (Ecco), is an account of his two-year quest to uncover the physical world on which our digital lives are built.

During a TED Conference in Edinburgh last June, Blum recounted how, over the past decade, his relationship with the physical world and his surroundings had changed. He started realising how he was spending less time out in the world and more time sitting in front of his computer screen. He also observed how our attention is constantly divided between the real and the virtual, between looking into our screens and out at the world around us. What was striking to Blum was that the world inside the screen seemed to have no physical reality of its own – it was cyberspace.

Cyberspace is a transcendent idea that has changed everything from shopping to dating. We cannot spatially locate cyberspace or perceive it as a tangible object, yet it is still real in its effects. The word was first used in the mid-1980s by American-Canadian science fiction writer William Gibson and it is nowadays used as a metaphor for the internet to give ourselves a sense of space and orientation.

Our brains are driven by meaning, and when something is too abstract to comprehend, we need to ground it in concrete terms. Cyberspace has a strange physicality, a place where people, albeit disembodied, meet and exchange information. Thus, we can say that cyberspace is an electronic landscape incorporating two worlds – the sensorial world of the organically human and the digitised, immaterial world.

Blum further explains how to him, the world beyond the screen is a kind of Milky Way – we are so small when compared to the galaxy that we cannot grasp it in its totality. Moreover, by spending hours online, we are easily immersed in this parallel universe we call cyberspace and more often than not ignore our physical presence.

Blum uncovered much of the mystery behind cyberspace by visiting the physical places that make the internet a living reality, such as 60 Hudson Street, New York, where, he says, the giant networks of the internet are housed. Blum also travelled to Portugal, where he saw the undersea cables connecting Europe and America being fixed. “My search for the internet has therefore been a search for reality,” he writes in his book.

After Baumgartner landed safely, my housemates and I started to wrap up our discussion. Sybil’s initial reaction had been that, no, the internet is obviously not real. But then, her ideas became more nuanced once Ján suggested it is real because he could touch the servers.

It’s true that we have grown into such a hyper-mediated world that we do not even realise the inherent paradox between real and virtual. By the end of it, we decided that the world beyond our screens seems to be both real and virtual because the exchanges of information and experiences made online happen between humans.

As Blum puts it, “A journey is really understood upon arriving home. What I understood (is that) the internet wasn’t a physical world or a virtual world, but a human world.”

This article was originally published by The Sunday Times of Malta, Sunday January 13, 2013. The featured image belongs to The Internet Mapping Project.

Disconnect to Reconnect?

A couple of months ago, I travelled to Alsace and stopped for one night in Paris.

I was staying at a youth hostel, and before I dozed off, I overheard an Australian in my dorm whisper, “Hey, there’s no wi-fi!” His friend replied, “Dude, you’re in Paris. Why the hell do you need wi-fi?”

And I thought about how two good friends backpacking across Europe and savouring its scenery, history and culture, still felt the need to be connected elsewhere.

Somewhere in the corners of the Australian backpacker’s mind hung the potential for a different connection and the looming fear that he was missing out on something that was happening elsewhere; something that he would never know unless he logged on.

With smartphones connecting us to the internet directly from our pockets, we now have the ability to span distances – the potential of acquiring a different connection with a pinch and a tap on a small screen is blurring the borders between virtual and physical space.

Social media provides us with a platform through which we can share content at no cost, to a boundless audience.

For instance, I wake up to see pictures of what my friends in Asia are having for breakfast, or what another friend bought while shopping in Paris or London – all this in real-time even though physically we’re in different time-zones.

Nonetheless, our perceived level of interconnectedness is only psychological.

What we are inherently creating via social media is what blogger Nicholas Scalice called the “biggest, most engaging conversation in the history of human communication”. Social media has not opened a window but a horizon for self-disclosure. But what exactly are we getting out of sharing ourselves online?

By nature, we cannot help but share our subjective take on things, no matter who is listening. Statistics show that 40 per cent of our conversations are about the self, and the popularity of social media might be related to our primal urge of talking about ourselves.

In fact, recent neuroscientific research demonstrates that acts of self-disclosure are accompanied by spurts of heightened activity in brain regions belonging to the mesolimbic dopamine system, which is associated with the sense of reward and satisfaction we also obtain from food, money or sex. The brain is thus positively reinforced, and that is why we find talking about ourselves so enjoyable.

The habit of online self-disclosure is not necessarily taken up by people who are bored or in need of company. A survey conducted by T-Mobile in the UK has shown that people are sharing their lives online even while on holiday. I would think that people travel to get away from the stresses and routines of home, and yet 60 per cent of Britons admittedly log on to Facebook or Twitter while on holiday, specifically to boast about what they are up to.

The term ‘smoasters’ (a blend of ‘social media’ and ‘boasters’) was coined to refer to people who use social media to talk with excessive pride and self-satisfaction about their achievements, possessions or abilities.

Yet updating others while on holiday is not a new trend. Take the 14th-century Italian poet Petrarch, for instance. He documented his ascent of Mont Ventoux in France, describing the journey to the summit and the views over the Rhone to the Bay of Marseilles.

It could be argued that if the same poet were to climb the same mountain today, he too would tweet verses about it. But would his subjective experience of the ascent be the same, or would it be existentially different?

Petrarch had the luxury of being alone, to process and reflect on his experience without being interrupted by other people’s updates rolling in on his newsfeed. Sometimes, I feel that we may be losing the beauty of the “now” because we are constantly pining for a different connection, possibly triggered by the fear of missing out.

Nobody can wait anymore, not because we can’t, but because we don’t have to.

Then again, new technology always sparks some sort of controversy, possibly instilled by an intrinsic fear born of our ignorance or misunderstanding of it. Nonetheless, we have always adapted it to our needs.

Sherry Turkle wrote that our relationship with technology is still in its infancy and evolving gradually.

Moreover, Howard Rheingold, in his recent publication Net Smart, encourages us to continue growing in this symbiotic relationship by learning to use media intelligently, humanely and mindfully.

Originally published in The Sunday Times of Malta on August 26, 2012.

What’s in a Meme?

“Footballer: Y U no use foot?” Indeed. That is the sound of memes, spreading like wildfire through my Facebook newsfeed. As if Euro 2012 wasn’t time-consuming enough. Now there’s a meme featuring nearly every player, every goal, and documenting every other minute of every match.

In this context, memes are graphics with large text in front of a related illustration, created for free, using templates from meme-generator websites.

Memes are becoming a central part of our everyday landscape of communication, almost replacing the traditional textual status update.

Their content – generally referring to cultural symbols – entertains and consequently questions aspects of society through their captions. Memes are in fact shared instantly via social networks (namely Facebook, Tumblr and YouTube), and can take the form of hyperlinks, videos or pictures. For brevity, I will only refer to graphical memes.

Memes are simply becoming a new way of passing on humour: jokes were once passed on by word of mouth and are today being spread virally as visual genres of expression. In the past, we may have laughed at one-liner jokes. Today we are giving these jokes a face.

Memes show how internet users are developing a particular creative intelligence which couldn’t have existed elsewhere. Meme enthusiasts develop a knack for observing and picking up humoristic cues in photos, film quotes and pop references, combining them and immortalising them in a meme related to a certain theme.

Almost every university worldwide, including ours, has its own meme community page, where students create and share a meme to purge their frustration over exams or student life in general.

Locally, the Paceville Malta Meme Page is the largest Facebook community, hosting over 3,000 users who share, like and comment on memes posted by other members.

My relationship with memes is a double-edged sword. I find them equally comical and pointless – on one hand, they are trivial, ridiculous collages of different media types, yet on the other they are incredibly witty.

Ray Bradbury, the late science-fiction author, was right when he said that ours is a culture and a time that is immensely “rich in trash, as much as it is in treasures.”

Thanks to the copy-and-paste technique, artists and creative professionals now have the luxury of modifying or commenting on each other’s work. Audiences are no longer passive consumers but also creators and innovators of content, which is increasing the amount of online competition.

With everyone fighting to be heard above the cyberspace noise, the task of passing on a message that is both interesting and meaningful is becoming more and more challenging.

American journalist Matt Labash criticises this idea of what he calls ‘copycatting’, stating that when we are not recycling our own memes, we are still dependent on “non-internet-generated material from old-school media dinosaurs”.

But then again, doesn’t the whole world borrow ideas?

“Substantially all ideas are second-hand,” Mark Twain observed, “consciously and unconsciously drawn from a million outside sources.”

Lawrence Lessig, American academic and author of The Future of Ideas, believes that copyright laws can be a threat to innovation, “since the future always builds upon the past”. According to Lessig, we are all producers who continually consume, remix and produce material.

British blogger Andrew Sullivan contends that the remix culture “teaches that making derivative work can be a form of real originality, and not that all derivative works are original.”

So there, copycatting may not necessarily be a threat to creativity.

The culture of remixing allows and encourages the public to add to, combine and modify existing material to produce a new product or meme. On the plus side, the remix and participatory culture reaps significant social benefits. It is cultivated by the philosophy that ‘sharing is caring’, further enhancing the sense of community within a global village. By accepting input from all participants, culture becomes richer in diversity and more inclusive.

In this respect, the internet is so powerful that we can create something and share it directly with our audience. It provides us with a digital platform to share our work and receive immediate feedback. This is something Leonardo da Vinci, Monet and Beethoven didn’t have – an inherent relationship with their audience.

In an economic crisis in which so many young people are unemployed, generating memes is an ideal way to be creative. Memes are created at no cost and generate immediate feedback. The fact that the internet allows for so much freedom encourages people to keep trying to create something.

As seen in The Sunday Times of Malta, Sunday, June 24, 2012.