Category Archives: Digital Culture

Bridging the digital divide

Elderly people and technology don’t flock together. They move at different speeds. And yet, Lewis Spiteri has managed to adopt the latest technology. Perhaps it’s his capacity to be curious and critical that has seen him successfully cross the bridge from a world without a phone to one with a smartphone.

Lewis, 71, has been using an iPhone as a communication and file-sharing medium for the past five years and has recently added an iPad. He also owns a Kindle, even though he still prefers reading a bound book, since the scent and feel of the paper draw him deeper into the story.

He has always chosen to remain abreast of evolving technology, which has changed so rapidly over the past decade. In fact, as we chat, we weave in and out of episodes from his life, through which technology runs as a common thread.

Lewis was brought up in Vittoriosa and currently resides in a cosy apartment in Santa Luċija with his wife Josephine. Together, they raised three children and now have seven grandchildren, the eldest of whom is my friend. In fact, we coupled the interview with one of her visits. So on a sunny Saturday afternoon we drive south to be greeted by the bubbly Josephine, who kindly leads us to the living room, where Lewis is sitting on a sofa, enjoying a game of local football on a large screen.

While Josephine can be heard clinking in the kitchen preparing our tea and biscuits, Lewis and I strike up our conversation. He comes across as a courteous and reserved man, one who weighs his words carefully.

“It’s unbelievable,” he tells me. “I started my working life with an afro and now I’m as bald as an onion. I was still a teen when I started out as an apprentice at the dockyard, which was, in my belief, a technology hub. I learnt a lot about technology there. I remember having to come up with my own tools every day, and this skill allowed me to sharpen my thinking and learn how to be innovative. No two ships we worked on were identical, so every day I woke up having to develop new concepts, to fail and try again until something worked. After around 30 years, I moved into an office to work in administration and later as a teacher.

Photo: Yentl Spiteri

“In the 1960s, we taught using a blackboard and a box of white chalk. Eventually, we had coloured chalk, so by the end of class, my hand would resemble a rainbow splashed in coloured dust.

“Thankfully, we soon got anti-dust chalk. By then, the blackboard was also rotational, an innovation which reduced the annoying process of having to constantly erase what I wrote. I still remember the introduction of the epidiascope. Have you ever seen one? I’m not sure you have. Gone are the days when I used to buy a set of transparencies. I remember I used to prepare the slides at home and then project them on the wall during class. This was a major improvement over the blackboard because we didn’t have to erase everything and the teaching material could be reused. However, this is nothing compared to now. Today I prepare PowerPoint presentations and can also use the internet in class.

“The internet made life so much easier for me, especially as a teacher. I remember having to print handouts which I would pass round in class. Now, I gather a list of my students’ e-mail addresses and send them their notes directly. Also, if someone asks a question, I can easily go on YouTube and explain through a video. Looking back, I can barely believe how we used to get any work done before. Today, it only takes me five minutes of preparation before a lesson, because all I do is enter the class, switch on my laptop, and I’m set,” he says.

“I have recently bought two external hard drives of 16GB each, just to make sure I’ll never run out of storage space. It’s unbelievable how external memory has changed the concept of filing. In my time, filing was literally papers, files, cabinets and a lot of physical space. Data retrieval has also become so easy. Before, I used to stress over a paper I might have misplaced, whereas now, all it takes is typing out the first three letters of the name of the document and it’s right there in front of you. Back in the day, only a magician could do that,” he smiles.
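The instant lookup Lewis marvels at is, at its core, a simple prefix match over filenames. Here is a minimal Python sketch of the idea – the folder name and three-letter prefix are invented for illustration, not taken from the interview:

    from pathlib import Path

    def find_documents(folder: str, prefix: str):
        """Return files whose names start with the given prefix (case-insensitive)."""
        prefix = prefix.lower()
        return sorted(
            p for p in Path(folder).rglob("*")
            if p.is_file() and p.name.lower().startswith(prefix)
        )

    # Typing just the first three letters is enough to narrow the search:
    for document in find_documents("Documents", "inv"):
        print(document)

Real desktop search tools go one step further and keep a pre-built index of filenames, which is why results appear as you type, but the underlying filter is the same.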

In 2000, Lewis was encouraged to read for an MBA.

“To be completely honest with you, I was initially quite hesitant since I didn’t think I would be that competent,” Lewis admits. “You see, my typing skills were close to nil – I used to type with only one finger. However, I went for it. I taught myself to use the computer, because I had no time to go for lessons. I remember back then, we still had a tower and a printer which sounded like a stone grinder. And of course, there was no internet, so all my research had to be done manually. I believe a lot in research. When you’re doing your own research, you’re learning how to search and gather information, how to be critical of what you are reading.

“I frequented the public library and the University. Basically, it was all about hard copies. And since a lot of my work was done through distance learning, I had to send my assignments by post. Just imagine how difficult it would have been if I hadn’t adapted to new technologies and learnt how to use the computer. I surely wouldn’t have managed, but I did. I got my MBA. I learnt on my own, the hard way.”

Curious to know what stage our conversation has reached, Josephine tiptoes back into the room to switch on the light, and as she leaves, I’m compelled to ask him one final question. How did the two meet?

“I met my wife in Valletta more than 50 years ago. We’re talking about the days when Paceville didn’t even exist. We were both in some teahouse and she caught my attention. I remember she was with a group of girls and I said to myself if I had to date one of them, it would be her. In our time there was no such thing as meeting someone online.

“I think that this is the only negative aspect of technology, because social media is steering us away from physical encounters and this is changing human nature. When we meet people face-to-face, we are studying each other not just through the words we say but even through the way we say things. Right now, I’m actually seeing you and you’re answering me in the here and now. I’m listening to your voice, noting your tone – there’s personal contact.

“I think when I first met my wife, it was love at first sight. Well, let’s not call it love at first sight, but I’m sure the emotion exists, because the phrase didn’t come out of nowhere. That deep feeling you get when you see a person for the first time can’t be replaced when seeing someone’s picture online.

“Anyway, Josephine and I didn’t date for a long time, maybe three weeks, or a month at most. Let’s say it was a month. Then I met her parents. Before we got married, I used to go to her house in Birkirkara after work. Remember, we’re still talking about the days when there was no telephone, so I literally had to go to her house in order to see her! And if I had to work overtime, I used to ask the bakery close to where I worked whether it would be okay for them to call and pass on the message.

“When I look back, I realise how life has gone by so fast that I never had the chance to stop and think. I started working at the age of 14, married at 21 and today, I’m 71. I never stopped, I just kept on learning. I believe the human being was created to move forward, to walk, to run. Unless you’re dead, keep going. It’s true there is a lot going on and society is constantly changing fast, but you can never stop progress. I believe that if you don’t accept new technologies, you’re very likely to fall behind, because technology offers so many advantages and it makes life so much easier.”

This interview was originally published in The Sunday Times of Malta on April 13, 2014. 


Does tech detract from childhood experiences?

My friend’s son Billy is now two-and-a-half years old. He is familiar with the YouTube app on his dad’s iPhone and knows it is a source of cartoons. Once his dad runs a search, he can scroll through similar videos on his own and choose the one he would like to watch. He knows a smartphone can take photos, and often requests a selfie.

Even though still very young, Billy is relatively savvy with his dad’s smartphone. But he’s not the exception: most kids his age are pretty much the same. If Billy is more likely to be the rule in today’s touch-screen based and app-dominated society, what are the implications for today’s kids in their experience of childhood?

Childhood is an exciting period of exploration. Through play, children develop and categorise their thoughts. They explore surroundings, materials, social realities and situations in a safe and unthreatening environment. Play offers social, emotional and intellectual development which is crucial during any child’s early years.

During our first two years of life, the brain is highly malleable, and the connections formed during this period are formative for a child’s future learning. Thus, the natural conclusion would be that since the Internet is the present and the future, children should be exposed to tablets and smartphones as early as possible so as to prepare them for the tech-saturated lifestyle beyond their strollers.

Even though educational cartoons and video games contribute positively to a child’s knowledge, problem-solving and academic abilities, consumption must be selective. International studies demonstrate that too much early exposure to screen technology may overstimulate the developing brain in ways that can be harmful.

Play therapist Jacqueline Abela DeGiovanni suggests that parents are often unprepared for the challenge, especially when it comes to regulating their child’s time with such technologies.

“My work includes working closely with families, and I do believe that excessive television viewing and offering little or no time regulation when it comes to any use of tech gadgets is proving to be somewhat problematic in our society,” she says.

Some of the health implications include a decline in physical activity, which could lead to childhood obesity, as well as a decline in pretend or make-believe play.

“Technology could be keeping children indoors, with little or no time for physical exercise and social interaction. Play activities, such as pretend play and make-believe play, are on the decrease because children prefer to use technology gadgets to occupy their time.”

For me, growing up in the 1990s meant that a home computer was an essential part of our home décor, much like having a fridge in the kitchen. Listening to my mother narrating stories of her childhood, filled with days playing with neighbours in the streets of her village, somehow always made me nostalgic for her past. By the age of four, I was already spending most of my time indoors, playing games off a floppy disk with quirky 2D graphics and annoying sounds, while my mother holds memories of a childhood based on authentic experiences and intimate friendships.

From my experience, I don’t think that technology is necessarily the cause of a decline in physical and make-believe play, but rather the effect of an accelerated lifestyle. With traffic-laden roads, compact housing with no room for a garden, and very few neighbourhood fields left where children can roam around, climb trees and play freely with one another, it is not surprising that they resort to technology to play and interact.

When I speak with Billy’s dad, he explains that his son still loves toys and spends much more time playing with cars and trains or reading books than on a smartphone. But of course, it’s his parents who consistently try to instil a sense of discipline and give their son ‘time-out’ from tech.

“He is definitely more excited about a friend coming over to play rather than watching TV or a video on YouTube,” Billy’s dad says.

This shows that children still have the natural capacity to create their own games and play with one another, despite being brought up in an electronic era. It is the fast-paced lifestyle that our society exposes us to that prevents them from doing so freely.

Today’s children are growing up in a digitally distracting environment, and it’s more than obvious that this isn’t going to go away. It is thus understandable that parents should also move with the times, but not uncritically. The key is to opt for a balanced approach between technology consumption and traditional forms of play.

Moreover, we must distinguish between passive and active media consumption, since the media landscape has changed drastically over the past decades: the passive consumption of television and radio now competes with tablets and gaming consoles, which are far more engaging on a cognitive level.

Screens are but a delivery mechanism; what is important is the content they convey.

In fact, the Centre for Child Health, Behaviour and Development at Seattle Children’s Research Institute in the US studied two groups of children aged 18 to 24 months: one was asked to play with blocks, the other to watch television. The former group scored significantly better on subsequent language tests. This illustrates how an interactive component during play promotes language development.

Dr Dimitri Christakis, the centre’s director, is now set to replicate this study in order to compare the effects of TV and interactive iPad games. Even though the research is still in progress, Dr Christakis predicts that the effect of tablets on the brain will be much closer to that of the blocks than of the television. This hints at the potential touch-screen devices might hold as academic tools.

So, instead of depriving our children of new technologies altogether in favour of traditional play, we need to help them use these technologies mindfully – perhaps even with the help of Facebook pages such as APPropriate, which is aimed at providing information about suitable iPad and Android applications young children will love learning from.

Speech and language pathologist Veronica Montanaro, who specialises in augmentative and alternative communication, has recently studied the way children below the age of three interact with an iPad, and has established developmental milestones that should help parents, professionals and application developers understand age-appropriate iPad use. The study reveals the ages at which iPad-specific behaviours emerge, spanning attention, behaviour, cognitive skills, language and communication, exploration of the device, posture, iPad handling and motor skills.

Billy’s generation is the first to grow up in a networked society with constant connectivity. The way they will process and present information, as well as their expectations of one another, will undoubtedly be different from any of our childhoods. Different doesn’t necessarily mean worse.

Detractors may fear that today’s children will eventually grow up to be zombies, lacking interpersonal and social skills, but I believe this is an overstatement. Given the right tools, within the right context and timing, Billy’s generation could potentially be the smartest and most creative yet – but they cannot accomplish this alone.

Billy’s dad believes in the importance of restricting the amount of tech consumption during the day.

“We should look into exposing the younger generation to more pretend, make-believe and outdoor play,” says Ms Abela DeGiovanni. “My recommendation to parents is to do their utmost to set a time limit on the use of technology gadgets.”

If you’re there to guide your children, these devices could be an opportunity for both you and your child to bond and learn from one another.

This was originally featured in The Sunday Times of Malta on February 16, 2014. 

Why do we take selfies?

Some ten years ago, my childhood best friend and I would head down to our baroque capital each Saturday morning to window shop, gossip and sip strawberry McDonald’s milkshakes while overlooking Valletta’s spectacular Grand Harbour views.

Then, we would visit the Savoy shopping mall and, as part of our weekly ritual, squeeze ourselves into a photo-booth, insert an Lm1 coin (roughly the equivalent of €2.33 today) and pull funny faces at the automated camera.

In 2003, neither of us had a mobile phone nor a digital camera. The photo-booth was our only means of documenting the outing.

If we were the same teens now, we’d undoubtedly be using our smartphones to capture selfies, and instead of keeping the shameful photos in our wallets (as we did), we’d keep a log of them on Instagram for the entire world to admire.

The “selfie” has quickly come to symbolise our culture in 2013.

In fact, the word selfie was recently named Word of the Year by Oxford Dictionaries.

Here’s the official definition: “(n.) a photograph that one has taken of oneself, typically with a smartphone or webcam and uploaded to a social media website.”

●●

What intrigues me about the selfie is just how quickly an act of vanity is coming to be accepted as a social norm.

Photo collage: boys’ selfies

None of the people in these selfies seem to be taking themselves too seriously. The expressions are mainly sexy, mysterious and playful.

How do selfies differ from posing in front of a ‘traditional’ camera?

I’d like to think of the selfie as being very similar to looking into a mirror.

At least whenever I switch on my front-facing smartphone camera to capture a furtive selfie, the first thing I do is check that my face is in order, before eventually pouting or squinting at my reflection on the screen.

You see, whenever we look into a mirror, we go through an internal process of scrutinizing our appearance – we try to cover up the elements we dislike, and enhance the attributes we like.

Photo: a mirror selfie

However, we tend to do all this in the privacy of our bedrooms or in the bathroom.

We pull faces at ourselves in the mirror, experiment with our hair, try on new make-up, play dress-up – we perform and experiment with different identities within a safe and secure environment.

Now, with the selfie, we are taking the behaviour considered normal in front of a bedroom or bathroom mirror into the public sphere.

And this is perhaps one of the reasons why the selfie has sparked controversy; it is a new phenomenon, one that we love to hate, purely because the art of selfie-taking requires not taking yourself too seriously, acting goofy, and making public what was once carried out in private.

Photo: girls posing for a selfie

As a generation, we are the pioneers of the selfie as a means of expression. Meaning: there are those who have already embraced the selfie and harnessed it (e.g. teens and celebrities). Then there are those who are still testing the waters, and in the process, delaying the selfie from fully becoming a normalised aspect of our culture.

●●

A selfie shared online is simply a process of bringing to the forefront what was once done in the background.

Basically, what the selfie is doing, is unleashing our obsession with self-portraits; it has made what was once invisible, visible across the entire internet universe.

In fact, selfies have always existed, albeit in a different format.

●●

Frida Kahlo was a Mexican painter, best known for her self-portraits.

Through a set of brushes and a vibrant palette, Kahlo depicted how she perceived herself on an external level. In today’s vocabulary, she painted her selfie.

Frida Kahlo Self-Portrait

Painting is nowadays often perceived as time-consuming and expensive. In this regard, the smartphone has democratised the art of self-portraiture to the extent that selfies are taken, modified and shared instantaneously at no cost, whatsoever.

But if we could take pictures of anything, why are we so interested in our faces?

Our face is the organ that distinguishes us from other people and is crucial to our identity. By flipping the lens and entering into the frame, we come to communicate deep ideas about who we are and where we fit into the world.

One of my favourite, and probably Frida Kahlo’s most famous quotes reads: “I paint myself because I am so often alone, and because I am the subject I know best.”

The selfie is a phenomenon in which the photographer is also the subject of the photograph – just like the self-portrait, but through a different medium.

What is perhaps most striking about the selfie is the fact that we are given control over how we are seen by the world – something definitely lacking in the filterless photo-booth where my first selfies were taken, ten years ago.

Are smartphones redefining our sense of ‘home’ and belonging?

Whether it’s for work, study or leisure, it has become fairly common for twenty-somethings to pack their suitcases and jet off to travel the world.

What could once have been a unique experience – travelling alone and encountering one’s own culture anew – has now become something of a lived dichotomy between being home and away, through the marked use of technology.

In the pre-networked days, to travel alone meant leaving your whole world behind you to step into unknown cultural terrain. The only news from home would come through snail mail or the monthly (expensive) phone call.

Nowadays, the Internet holds our world together in a network infrastructure, and wireless Internet devices make our networks portable. What’s more, online communication (such as e-mail or Skype) is free and instant, overcoming both constraints of those classic communication methods. Tethered, we therefore carry a sense of ‘home’ with us through our mobile Internet devices.

During my solo travels in Asia and continental Europe, the smartphone was my Swiss Army knife of sociality, since it carried my physically scattered social networks intact. It offered an instant portal to people, news and memes that kept me up to date with the rhythm of life in Malta. As heavenly as it might read on paper, in practice it created something of an inner conflict.

In a sense, I was in-between worlds, because my best friends weren’t necessarily in the city I was living in, nor in Malta – but on the Internet.

For instance: while I rattled my bicycle to and from the library in a quaint cobble-stoned city in the Netherlands, one of my best friends attended pub quizzes behind York Minster after lectures, while another boiled haggis for occasional Sunday lunches in Glasgow. The three of us Maltese ventured alone, yet social networking apps such as Facebook Messenger and WhatsApp allowed us to remain pretty much together.

Irrespective of where our loved ones are, the idea of here and there is somehow shattered through this newly acquired networked intimacy. The phone has facilitated communication with all our friends, wherever they are, altering our perception of time and space; it has come to represent a ‘mobile home’.

My German friend Saba once told me, “I moved from Germany four years ago. I went to Botswana, I went to Luxembourg, to France. I always took my friends with me, through my smartphone. That’s how I felt. Now I can talk to my friends instantly through my phone.”

Like Saba, my friends travelled with me from the Philippines, to Italy, to Belgium and to the Netherlands thanks to the Internet, and more intimately via Skype.

Video-conferencing (like Skype or FaceTime) is a fairly new and very common means of maintaining close contact with those who matter most. The quality of the call makes up for physical meetings when these are not possible. While living in the Netherlands, my Polish housemate used to Skype with his mother in Warsaw almost every evening. “I feel that we are near each other during the conversation,” he used to tell me.

Our brains seem to record so-called ‘real’ and ‘virtual’ events similarly, and modern technologies conspire to blur these realms further. As a matter of fact, we code face-to-face and online experiences similarly, often with equal realness. One may notice this in everyday language, when we speak of online encounters as if they were real: How is Sarah doing? Fine, I guess. I spoke to her on WhatsApp. Did you meet her new boyfriend? Yes, I saw them together on Facebook.

The sense of visual immediacy experienced via video-conferencing and modern social networking creates a simulation of presence and intimacy, such that even when people are physically distant, social networks can act as a connective tissue, coordinating and synchronising conversations – with friends scattered across the world – that would otherwise dissolve into silence.

Nonetheless, these mediated communication platforms do not merely substitute for face-to-face interaction; they constitute a new kind of presence.

The Internet and smartphone could be used to either enhance a sense of belonging to the place where one is physically present, or it could alienate the individual from fully experiencing the actual place, culture and surroundings.

From my experience, technology compensates for the rarity of physical encounters, but doesn’t replace them. Even though the Internet eliminates feelings of distance, the sense of presence and the level of intimacy are only short-lived. At the end of the day, we all need to live certain aspects of our lives together with the people we love most, and that cannot happen through a screen.

Before the emergence of online social networking, communities were formed around a fixed geographical space and therefore led to a tangible concept of what it means to belong and feel at home within a given space.

Now the Internet beckons us to ‘come together’ across a medium, suggesting that we can feel and experience home, and belong somewhere that is not necessarily the same place we are physically bound to.

Living in a network society, it has become easier for me to define home in terms of scattered people than in terms of a physical town or city. For the upcoming generation, a sense of belonging need not be tied down to residential geography but to a new, emotional geography.

As seen in the December 2013 issue of The Sunday Circle, Malta’s leading culture and lifestyle magazine.

Unplugged: Why do people refuse to connect?

There is a homely Italian hang-out outside the Tal-Qroqq campus, which seats around fifteen people at its black and white plastic tables outside, and maybe another thirty on its two floors inside. At lunch hour on your average Wednesday, the place is full up. Students, scholars and staff have all gathered to refuel, gossip, network, and surf the net.

To my left, two male students are swiping through photos on their iPads, the girl in the corner is discreetly reading something off her laptop, while the couple behind me is snapping selfies on their smartphone.

I’m guilty as charged – my fingers are typing away at a laptop while I listen to The Killers on my iPod and attend to e-mails on my smartphone.

Despite the seemingly unstoppable tide of wireless devices sweeping our planet, there are still people out there who have never used the internet. Today, to connect means to be online. Yet, in the EU, 33 percent of citizens do not have internet access at home, and 29 percent claim they never access the internet. In another survey, 15 percent of Americans said they do not use the internet at all.

Who are these black sheep, and why are they not flocking online?

At the moment, we are accepting a worldview wherein adoption of new technology is the norm. Science and technology scholar Sally Wyatt wrote: “the use of information and communication technology (or any other technology) by individuals, organizations, and nations is taken as the norm, and non-use is perceived as a sign of deficiency to be remedied, or as a need to be fulfilled.”

As figures show, the majority have quickly come to adopt and adapt the internet to their everyday lives, demonstrating that the internet is no longer a luxury, but a given.

Just as users take an active role in shaping a certain technology through its use, non-users also contribute to the configuration of technologies across society and culture. Wyatt explains that it is important to get to know who these non-users are, and more importantly, why they opt not to conform to a rising culture of connectivity.

One would assume that the problem for these citizens is access; therefore, making the internet cheaper and providing education and training would be among the obvious solutions to reduce the number of digital virgins.

This year, the EU has successfully achieved full broadband access across the entire continent, as part of the European Commission’s Digital Agenda, to make ‘every European digital.’

But enhancing access is also based on the assumption that internet non-use is a problem to be solved, and that once these barriers are overcome, people will embrace the technology with open arms.

The way that technology is adopted into our everyday life depends highly on the demographic and psychological characteristics of its users. As with users, a non-user’s age, gender, education and income also play a role in determining motivations for non-use. When asked about their main motivations, non-users gave a variety of answers for shying away from the internet.

In the recently published Pew survey in the US, 34 percent of non-users said they stay offline because they feel the internet is not pertinent to their lifestyles. They claim to be disinterested and do not want to make use of the technology. Others mention concerns about privacy (viruses, hackers, spam) or find the internet frustrating or difficult to use. Still, those who are offline are aware of the internet’s value: 44 percent of offline adults said they have asked a friend or family member to look something up or complete a task on the internet for them.

Interestingly, even those who neither have a computer nor plan to use the internet in the near future express a belief that computer skills are becoming a necessity – even if they could not articulate activities for which they could potentially use a computer.

Age is a major factor in internet usage and, unsurprisingly, non-users tend to come from an older generation: 44 percent of offline Americans are older than 65, while only 2 percent are between 18 and 29 years old.

Moreover, those with lower incomes or less education are also more likely to be offline, as are those whose future goals are less clear than those of adopters. Among offline Americans, some are constrained by financial reasons (19 percent) or by a lack of physical availability or access to the internet (7 percent); for others, the barrier could even be illiteracy.

Wyatt draws a distinction between non-users, that is, those who do not have access to a given technology, and the want-nots, those who consciously resist or reject a technology. She explains that it is the latter group which, if paid sufficient attention, can help in diversifying and enhancing technology.

Truth be told, I formed part of the want-nots until earlier this year, for I had refused to venture into the smartphone world. For a number of years, I got by with a Nokia phone whose most exotic feature was a torch. As long as it fulfilled my basic need to send and receive texts, I was happy.

However, in the eyes of society, I was excluding myself from the 60 percent of 16- to 24-year-olds across the EU who access the internet on the move. I was a black sheep – a non-adopter of new technology.

It was a personal choice that separated me from my contemporaries. To be frank, I can’t say that I wasn’t intrigued by smartphones, but I had my doubts. Apart from being expensive, I thought a smartphone would be intrusive, and I very much appreciated the notion that when I’m out of the house, I’m completely disconnected from the internet.

Usually, the diffusion of new technologies and behaviours across society occurs through a process of modelling and social influence. I was part of the diffusion, but as a spectator – a rebel against wireless internet technology. I wasn’t ready to take the plunge and have my ‘life changed’ in such a short period of time. I was protesting against the idea of being constantly connected; at times, I romanticised the beauty of letter-writing, and instead of falling victim to the future, I fell into the trap of nostalgia. Alvin Toffler would diagnose my behaviour as a symptom of future shock.

During the past five years, wireless technology has diffused across society, becoming a ubiquitous symbol of today’s culture. In this day and age, a high-speed internet connection is not merely restricted to the haven of our homes, or the conditioned air in our offices. Public spaces have also come to embrace wireless access. In fact, a lot of cafés, recreation centres and village squares offer free Wi-Fi and accessible power sockets to charge our devices, encouraging people to pull out their devices and stay connected, whenever, wherever.

Everywhere I went and whoever I was with, I was followed by this unspoken pressure to conform. Evolutionary psychology repeatedly shows how our basic human motive is to connect, and this is what eventually drove me into buying a smartphone: the basic need to connect.

I was getting tired of the resistance, which slowly made me feel like a grandma living in a twenty-something’s body. I had to adapt to a world with smartphones by getting one too. Having a smartphone made me feel connected, part of something. As superficial as it may sound, I belonged.

What is interesting is that when, as part of some personal research, I asked my friends why they had decided to buy a smartphone, I was met with ambivalent answers. Essentially, they explained that they don’t feel the need to own a smartphone and would willingly give it up; however, it makes it much easier for their friends to reach them.

So there: it’s not that they really needed their smartphones; it’s the tide of the culture of connectivity that swept them in.

After his year of self-imposed exile from the internet, The Verge journalist Paul Miller came to realise just how much easier the internet makes it to feel a relevant part of society. Without the internet, he fell ‘out of sync with the flow of life’. “The internet isn’t an individual pursuit,” he writes, “it’s something we do with each other. The internet is where people are.”

Back in Tal-Qroqq, even though we were all sitting in the same café only to ignore each other, we were all connecting – through the internet.

When Joshua Meyrowitz wrote in the mid-1980s that “to be out of touch in today’s world, is to be abnormal”, the smartphone was still a product of science fiction. Today, it is a mainstream commodity and the most rapidly adopted technology in human history, for one main reason: the internet.

In 2013, the internet is all embracing. It’s unavoidable. Everything is the internet: we are the internet. Manuel Castells said that it is difficult to go back to a pre-networked society, just as we cannot go back to a world without electricity.

Connectivity is no longer something abstract, it has fashioned itself into a state of mind. Now we are tethered to the rest of the world through the internet enclosed in a pocket-sized device. For better or for worse, the internet has changed our lives forever. Despite the digital divide, in the future, everyone will be online.

And in my earphones, The Killers front-man echoes the words: “This is the world that we live in/no we can’t go back.”

Originally published in The Sunday Times of Malta on November 10, 2013.

Does foodtography ruin our appetite?

Over the past two years, my social media feeds have more or less evolved into a culinary still-life expo. We’ve gone from Facebook to Recipebook. But in truth, why are we meticulously documenting our culinary adventures and sharing them with a virtual public?

A leisurely scroll reveals that what was once a cacophony of people’s concerns and whereabouts has suddenly become more visual – and it’s not merely selfies, but also what people are eating. Because let’s face it, even Nanna’s lampuki pie deserves its online moment.

Foodtography is the relatively recent trend of taking pictures of food and sharing them online via social media platforms such as Instagram, Facebook, Twitter and Pinterest. What I find particularly interesting about this phenomenon is that the photos generally feature food, sans people.

A quick browse through my childhood photo albums shows pictures of people seated at long tables during summer barbecues and anniversary fenkati. But whatever the occasion, the main actor was not food – the focus was on the people and the eating experience as a whole.

On the other hand, with foodtography, the food has become the subject of the photograph, with most photos excluding the diner.

This concept is pertinent in marketing and advertising strategies. Take Foodspotting for instance – this app, integrated with a map of restaurants close to your current location, showcases dishes that people have eaten. The app tagline – find and share great dishes, not just restaurants – encourages diners to shoot, tag and rate dishes under the #foodspotting hashtag. Then restaurants can promote their food while enticing new clients, for free.
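Under the hood, “restaurants close to your current location” typically amounts to a distance filter over geotagged entries. Below is a rough Python sketch of that idea; the haversine distance formula is standard, but the sample dishes, the 5 km radius and the rating sort are invented for illustration and have nothing to do with Foodspotting’s actual code or API:

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points on Earth, in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371 * asin(sqrt(a))

    # Hypothetical geotagged dishes: (name, rating, latitude, longitude)
    dishes = [
        ("lampuki pie", 4.8, 35.8989, 14.5146),
        ("ftira", 4.5, 36.0443, 14.2512),
    ]

    here = (35.8997, 14.5147)  # the diner's current location
    nearby = [d for d in dishes if haversine_km(here[0], here[1], d[2], d[3]) < 5.0]
    nearby.sort(key=lambda d: d[1], reverse=True)  # best-rated dishes first
    print([d[0] for d in nearby])

At scale, an app would swap the linear scan for a geospatial index, but the filter-then-rank shape of the query stays the same.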

Of course, the people you are eating with may believe that taking pictures of your food spoils the atmosphere of the meal. Food alone is a basic need for nourishment and survival, but eating together is deeply rooted in human culture. People who gather around a table are present to share more than just a meal – they share a conversation. People come together for special occasions and construct collective memories and experiences over food.

Professor Signe Rousseau from the University of Cape Town, South Africa, believes that “most of us love to eat, and we also love to tell stories through food. We all know that a picture is worth a thousand words, and as communication is becoming increasingly visual, we rely on others to make sense of and interpret the food we share.”

Perhaps this is what we are trying to emulate through foodtography – a sense of virtual togetherness.

Self-proclaimed foodie Kim Davidson from Brooklyn, in the US, recently explored people’s motivations behind foodtography. A former avid foodtographer herself, she explains: “By combining photography with our storytelling capability, we are able to easily build discourses, especially for those who cannot partake in the meal with us.”

Photos capture special moments, thus providing information to those who aren’t present. Moreover, sharing such moments with an online audience enables people to engage in a discourse where personal memories are cued by photographs. “People’s relationship with food does not only satisfy our biological needs,” she continues. “It is also a profoundly social urge.”

Based on the ethos that sharing is caring, the internet and social media have created a virtual platform for foodie communities to gather and exchange their love and appreciation of good food. “Social media and food have one unique and seemingly genuine commonality, that of integrating people,” Davidson says. Indeed, social media does what food does best – it brings people together.

In this way, foodtography could also be perceived as a means of attracting people to one’s profile, increasing the chances of interaction via likes and comments, and thus satisfying one’s need for recognition. Additionally, the saying “you are what you eat” supports the claim that foodtography is linked to the online shaping of our identity.

Recently, a group of researchers from Brigham Young University, Utah, the US, found that an obsession with foodtography could be spoiling our appetite. They claim that looking at too many photos of food can make our eating less enjoyable due to sensory boredom.

As far-fetched as this may seem, there might actually be a grain of truth here. After a whole morning shooting irresistible dishes for a restaurant’s new menu, a food photographer friend of mine told me: “I didn’t eat anything for lunch. It felt like my body had already digested the food.”

Pictures are a representation of our environment – they have the ability to evoke emotions and may thus seem to reproduce reality. In this way, when we trawl through foodies’ profiles, our bodies could be fooled into experiencing the food as if it were present in front of us. If you pay close attention, you might realise that you start to salivate as a result of your body’s physiological reaction.

By the end of 2010, 80 billion photos had been published on social media platforms. That goes some way towards explaining how, nowadays, a lot of people don’t just write about what they’re up to – smartphones have facilitated visual communication, such that people also share photos of what they think, do and eat.

Foodtography has also facilitated the exchange of recipe ideas and created a whole new realm for advertisers. Moreover, food diaries may also eliminate the sense of loneliness one may feel when eating alone. However, we must remember to enjoy the company of others during a meal, since taking photos of food can alter the atmosphere when actually eating together.

This article was originally published in The Times of Malta, October 23, 2013.

A Cyberspace Odyssey

I’m sure that last October 14, most of you were sitting in the eight-million-strong audience watching Felix Baumgartner’s record-breaking jump live on YouTube. The 43-year-old Austrian skydiver ascended for two hours in a purpose-built capsule carried by a helium balloon before taking a giant leap from 128,000 feet.

I watched him for an hour before the jump, on a 15-inch laptop screen, sitting with three of my housemates in our kitchen, nibbling freshly baked apple cake.

“It’s just like a movie,” exclaimed Tobi. And it’s true, because the idea of watching a 21st-century daredevil fall through the clouds seemed more like science-fiction than reality.

My adrenaline was pumping as though I were there, perched on his shoulder. Physically, I was in one place, but at the same time, I was totally immersed in another.

This dichotomy is experienced even more vividly when we’re online. So if we’re increasingly experiencing our world through various media, does it mean that life is becoming more virtual?

The history of media and technology has been driven by our quest for immediacy. The internet has refashioned and extended earlier media – mail became e-mail, telephone conversations turned into Skype calls, television and radio became YouTube and Spotify, and our printed photos became albums stored in the cloud.

Spanish sociologist Manuel Castells states that we are moving from virtual realities to real virtualities. He writes that “we are not just on the screen through which experience is communicated, but we become the experience”.

The advent of new media has brought about a convergence of different dimensions of communication spanning the globe, which are blurring the boundaries between the real and the virtual realm. As a result of this, we Instagram our food to make our friends’ mouths water at lunch.

And we no longer have to wait for the evening news – social media and live blogs posted by citizen journalists take us directly on location, when and where the news is happening, in real time.

We also call our friends and relatives abroad for free and watch them as they speak to us. As a result, our experience of reality – the here and now – is affected on a sensory as well as on an experiential level. If we spend whole days on Skype with relatives abroad, this can alienate us from living the full reality of our physical surroundings, in turn making the physical world more virtual and the digital more real.

While watching Baumgartner hovering through the air, the conversation with my housemates turned to the concept of the internet. I asked them whether they consider the internet to be real or virtual, and was met with quizzical looks as they pondered answers that none of us had.

For most of us, any notion of how all this information arrives in our homes and workplaces is weirdly immaterial. The world behind the technology of the internet is something many of us fail to think about. It is taken for granted – we just don’t think about the why and the how. It just is.

Normally, we think about things when something goes wrong, like realising that your car runs on four wheels when you get a puncture.

In fact, journalist Andrew Blum started wondering what the internet was really made of when a squirrel chewed through a cable and knocked him offline. His recent publication, Tubes: A Journey to the Centre of the Internet (Ecco), is an account of his two-year quest to uncover the physical world on which our digital lives are built.

During a TED conference in Edinburgh last June, Blum recounted how, over the past decade, his relationship with the physical world and his surroundings has changed. He started realising that he was spending less time out in the world and more time sitting in front of his computer screen. He also observed how our attention is constantly divided between the real and the virtual, between looking into our screens and out at the world around us. What was striking to Blum was that the world inside the screen seemed to have no physical reality of its own – it was cyberspace.

Cyberspace is a transcendent idea that has changed everything from shopping to dating. We cannot spatially locate cyberspace or perceive it as a tangible object, yet it is still real in its effects. The word was coined in the early 1980s by American-Canadian science fiction writer William Gibson and is nowadays used as a metaphor for the internet, to give ourselves a sense of space and orientation.

Our brains are driven by meaning, and when something is too abstract to comprehend, we need to ground it in concrete terms. Cyberspace has a strange physicality – a place where people, albeit disembodied, meet and exchange information. Thus, we can say that cyberspace is an electronic landscape incorporating two worlds: the sensorial, organically human world and the digitised, immaterial one.

Blum further explains that, to him, the world beyond the screen is a kind of Milky Way – we are so small compared to the galaxy that we cannot grasp it in its totality. Moreover, by spending hours online, we are easily immersed in this parallel universe we call cyberspace, and more often than not we ignore our physical presence.

Blum uncovered much of the mystery behind cyberspace by visiting the physical places that make the internet a living reality, such as 60 Hudson Street, New York, where, he says, the giant networks of the internet are housed. Blum also travelled to Portugal, where he saw the undersea cables connecting Europe and America being fixed. “My search for the internet has therefore been a search for reality,” he writes in his book.

After Baumgartner landed safely, my housemates and I started to wrap up our discussion. Sybil’s initial reaction had been that, no, the internet is obviously not real. But then, her ideas became more nuanced once Ján suggested it is real because he could touch the servers.

It’s true that we have grown into such a hyper-mediated world that we do not even realise the inherent paradox between real and virtual. By the end of it, we decided that the world beyond our screens seems to be both real and virtual because the exchanges of information and experiences made online happen between humans.

As Blum puts it, “A journey is really understood upon arriving home. What I understood (is that) the internet wasn’t a physical world or a virtual world, but a human world.”

This article was originally published by The Sunday Times of Malta, Sunday January 13, 2013. The featured image belongs to The Internet Mapping Project.