Having had a couple of odd (sometimes simply bad) experiences with providing feedback to recent graduates, I set myself the task of analysing how this could have happened. I'm pretty sure everyone wants to learn, and feedback plays a key, yet often uncomfortable, role in that (at least on the receiving end). The way I've learned to provide positive and negative feedback (and have received it from the older generation) no longer seems very effective.
Even with 'the gloves on', I noticed that any negative feedback became a great source of concern and insecurity for the persons I was talking to. They started doubting whether they were good enough for the job or whether they were in the right place at all. When you consistently receive negative feedback, I can understand that these thoughts may start to cross your mind. But after one session?
Discussing the issue with friends and family, some themes started to emerge. First of all, the content of the message was not the issue. The feedback itself had its merits. Secondly, the recipients seemed caught off-guard, not knowing what to do with the situation. As if this was the first time they had encountered a somewhat difficult one-to-one. This triggered some further thinking.
Could the way we interact personally with each other be so different between generations that it triggers all kinds of unintended side-effects? Current graduates use far more tools, such as social media and direct-messaging apps, that do not require real-life personal interaction; seeing a facial expression combined with the words makes a message land completely differently than when it is merely read from a computer screen (or listened to 😉). I'm not saying this is necessarily wrong, I just observe that it may make life quite challenging. After all, older people tend to be the teachers of the younger ones.
Still, I feel there is room for some rebalancing. The finer points of personal interaction play a vital role in our society and need to be practised. How often does a child actually walk up to somebody to ask a question, let alone make a simple phone call (still no facial expression, but at least some tone to the words)? Practice makes perfect, but if you do not get to practise that often anymore, what happens?
I also believe this topic is connected to our earlier observation about the lack of focus on resilience - our capacity to recover swiftly from adversity (see the obvious link?).
Processing all this, I keep coming back to the conclusion that more personal interaction is the solution. What would that interaction look like in today's world? Is it through social media after all, and do I need to finally make that jump (I'm extremely reluctant to do so), or can we find alternatives? We need the intimacy, one way or the other. Or both.
After seeing the documentary 'The Social Dilemma' and writing about it way back in Q&A 003, I once half-joked to my two sons that they would get a mobile phone when they turned 18. A few weeks back, I overheard them repeating this exact statement to one of their friends, who responded with a heartfelt 'really?'. Now that they are almost turning nine, mobile phones are apparently becoming an issue, which means we have to start defining a strategy.
Fully realising the addictive nature of modern smartphones and the suite of applications inhabiting them, I contemplated our responsibility as parents in relation to potentially harmful substances and experiences in general. What makes us protect our kids from certain things up to a certain age?
My working hypothesis right now is that we seek to shield our kids from anything they are not yet able to handle themselves. Be it physically or emotionally, we try to estimate the moment when certain experiences might still be challenging, but no longer harmful. This hypothesis made me think of the classic "comfort-stretch-stress" distinction, in which alternating between comfort and stretch provides growth and learning, while stress can cause adverse effects. For your kids, you aim for learning and growth, not trauma.
I ended up concluding that an all-or-nothing (0/1) decision is probably not the best solution. Analogous to alcohol, being allowed to switch from soda to coma-drinking at age 18 has its downsides. Adding extra dimensions to this puzzle could help: usage limits, time limits, app limits and slowly expanding freedom might be the approach we go for.
Discussing this subject did bring up several reasons for humility. Even as adults, we are sometimes not fully in control of our own mobile phone use: checking for messages, the inability to leave a notification badge untouched, doomscrolling on Twitter or YouTube. The strategies we will employ for our kids might warrant some introspection as well. We will keep you posted.
Dutch newspaper De Volkskrant recently published an article on how the public can be taught to deal with misinformation, popularly termed 'fake news'. Unfortunately, the article is only accessible with a paid subscription. The content is based on research by the University of Cambridge, reported on in 2021 and 2017 in the context of misinformation about Covid-19 and climate change, respectively.
The process proposed to build up 'resistance' to misinformation is ingenious in its simplicity: start with a warning ("you're being manipulated"), follow up by explaining the techniques used, and present an example. After some time, you may be served a booster (e.g. another example). The result is that you're better prepared to discern misinformation from real information. The process is modelled on how vaccination works: your body gets acquainted with potential invaders and hence builds up defensive mechanisms.
The different articles describe what the ideal cocktail could look like. Misinformation can usually be recognised by one (or more) of the following communication techniques:
Misinformation is potentially a big problem because it is very personal and difficult to judge objectively. Information that speaks to your mind and confirms built-in beliefs (and prejudices) tends to be very sticky; that's the confirmation bias doing its job. When that kind of information is served repeatedly, it's no wonder it becomes your truth. Having said that, as there probably is no one overall truth, at least philosophically speaking, there is also no fake news...
A good way to start is to be wary of anyone claiming something to be either fake news or the truth. Figure out why that person or institution needs to make such claims. Combining this analysis with your own 'prepared' mindset gives you a well-rounded opinion of your own.
More research is required, but I personally see this way of preparing the public for misinformation not necessarily as a vaccination campaign. I'd rather speak of educating us to go back to basics and be really critical of what we see, read and consume. Being able and comfortable to think for yourself is a critical factor in making our world less black-and-white, more colourful and in balance.
When Tim Ferriss started his podcast back in 2014, I became one of its most faithful listeners (and promoters, as those who know me can probably remember...). I often spent commutes listening to one of his (often 2+ hour) episodes. I religiously devoured even the episodes with guests that did not immediately attract me, since those too touched on interesting subjects I knew nothing about. Somewhere around the 220-episode mark, I lost my streak. Episodes started 'piling up' and I felt I was falling short of 'keeping up'. Fast forward seven years and Tim is now at episode 536. I listen to an episode every once in a while, but feel perfectly comfortable missing out on tons of possibly wonderful material.
Letting go of the notion that all my selected sources needed to be consumed in full did feel awkward at first, and I still struggle sometimes. A growing pile of old Economist magazines near my tea table is living proof. These days, my consumption of sources comes and goes in waves. I tend to feel ok with that, knowing that the ebb and flow of my curiosity is a steadfast phenomenon. I want to bite off only what I can really chew on. Only when I notice I have not consumed anything for a longer period of time is some introspection warranted. Often it means there is not enough space and downtime.
The same goes for this newsletter. Most of the articles we write are meant to be consumed for introspection purposes. They're not daily news; they're an invitation to take from them what you find valuable. If you feel like you're not able to chew, it's fine to let go and not feel bad about it. Hit unsubscribe and make room for other joyful activities. We won't be disappointed. Promise!
Dutch newspaper NRC recently featured an article on social media and its effects. As a parent of high-school-age kids, I find the topic highly interesting. After having seen 'The Social Dilemma', about which we wrote earlier, social media features high on my 'worry-about-not-having-a-satisfactory-answer-yet' list.
One of the items mentioned is that our attention span is decreasing. People seem to have more difficulty finishing a book, holding long conversations, or working without interruptions. It is, however, difficult to prove beyond doubt that the decreasing attention span is directly linked to the increasing use of social media, mainly because there is no trustworthy data about attention spans in the past.
The NRC article mentions writer Johann Hari, who wrote Stolen Focus, in which he claims that our attention has been taken from us by external forces. He bases this conclusion on conducting more than 200 interviews worldwide and analysing his own decreasing attention span. Hari describes 12 societal developments that have negatively impacted our ability to concentrate. On a positive note, he also offers (practical) solutions to get back in control.
Whatever the arguments, the current effects and what's to blame, we can probably all agree on the benefits of having some good old focus to solve big problems, nurture creativity to the max and live a happy life.
To me, it all boils down to the notion that we have choices. We may feel pressured to conform to certain standards; even governmental institutions communicate through Facebook or Instagram. The alternatives are often not well known. This takes time and education, and courage; after all, you'll not be following the main crowd. I'm pretty confident that human beings are perfectly capable of remaining true to themselves and making their own choices. Let's just keep repeating, over and over again, that you have a choice. Plenty of them!
In public, almost everyone will tell you they despise it. At the same time, everyone will probably have to admit they have done it. Gossiping. Assistant professor of history Christopher Elias wrote an interesting article about the phenomenon. Not surprisingly, gossiping can be damaging, but it also seems to serve an important social function.
"Gossiping has historically served an essential social function by fostering interpersonal connections. At its core, gossiping is an intimate act, requiring the discussion of sensitive topics in enclosed spaces."
Interestingly, the word 'gossip' stems from 'godsibb', an 11th-century term for the attendants at a child's baptism. By the 16th century, 'godsibb' had become 'gossip', a noun used to refer to the 'close female friends whom a woman invited to attend her at childbirth'. Men, who were not allowed to attend, apparently came to fear what was being said in their absence. Mr. Elias refers to the evolutionary psychologist Robin Dunbar, who argues that gossip evolved to allow individuals to exchange important information, keep track of their allies and competitors and, most importantly, intensify social ties in growing, complex societies.
Research does, however, seem to indicate that physical presence is important while gossiping. The real-life, interpersonal connection somehow matters for really judging what is being said and forming an opinion. Current social media platforms allow for a certain disconnectedness from that important set of filters and enable increased superficiality and voyeurism.
"By granting me distance from those people, social media allowed me to embrace the worst elements of gossip; I enjoyed the illusion of access to this woman’s thoughts and intentions, but no real connection to her."
Let's not judge too soon. Good gossip does exist, and I (want to) believe that we'll be able to find ways to do it in an online environment as well. Understanding and integrating social media into our daily lives is a challenge. I'm curious to learn more, and to hear how others are dealing with this.
While I chose to get vaccinated against COVID-19 a few weeks ago, I am still interested in understanding how well the vaccines are performing. Considering the enormous stakes at hand, I think it is wise to keep a critical eye. The other day I decided to take two often-shared tweets doubting vaccine efficacy and see what proof they were referencing.
The first tweet quoted a British analysis of recent COVID-19 variants of concern, sharing numbers for severe cases of the new Delta variant. The numbers of severe cases among fully vaccinated people were circled with red markers: they were even a bit higher than the numbers among unvaccinated people! Unfortunately, this does not prove that vaccines do not help; it just proves the author was not paying attention in statistics class, since the UK has one of the highest vaccination rates in Europe. In the vulnerable 65+ demographic, vaccination rates are well over 90%, which means the unvaccinated group is roughly nine times smaller, so its case count should be multiplied by about 9 to get a fair per-capita comparison.
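To make that base-rate effect concrete, here is a minimal sketch with made-up numbers (my own illustration, not the figures from the British report):

```python
# Illustrative only: made-up numbers, not the actual UK surveillance figures.
vaccinated_population = 9_000_000    # ~90% of a hypothetical cohort of 10 million
unvaccinated_population = 1_000_000  # the remaining ~10%

vaccinated_cases = 120    # hypothetical severe cases in the vaccinated group
unvaccinated_cases = 100  # hypothetical severe cases in the unvaccinated group

# Raw counts suggest the vaccinated fare worse (120 > 100)...
vaccinated_rate = vaccinated_cases / vaccinated_population
unvaccinated_rate = unvaccinated_cases / unvaccinated_population

# ...but per-capita rates tell the opposite story.
print(f"severe cases per 100k, vaccinated:   {vaccinated_rate * 100_000:.1f}")    # ~1.3
print(f"severe cases per 100k, unvaccinated: {unvaccinated_rate * 100_000:.1f}")  # ~10.0
```

With a 90/10 split in group sizes, comparing raw counts is exactly the multiply-by-nine trap the tweet fell into.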
Another tweet copy-pasted a table from a peer-reviewed study in the renowned medical journal The Lancet. While most communication on vaccines tends to quote the RRR (Relative Risk Reduction, 94-95% for Moderna and Pfizer), the study also showed the ARR (Absolute Risk Reduction), a seemingly much less impressive percentage (1.2-1.3%). Considering that the absolute risk of getting seriously ill from COVID-19 is about 1.5-2.0%, these numbers are just as impressive as the RRR; they're simply a different representation of the same effect. These lower percentages were nevertheless shared in the tweet as 'the real truth about vaccine efficacy'.
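The relationship between the two measures is simple arithmetic. A minimal sketch with hypothetical trial numbers (not the actual Lancet figures):

```python
# Illustrative only: hypothetical risks, not the numbers from the Lancet study.
risk_unvaccinated = 0.018  # ~1.8% of the control group got seriously ill
risk_vaccinated = 0.001    # ~0.1% of the vaccinated group got seriously ill

arr = risk_unvaccinated - risk_vaccinated  # Absolute Risk Reduction
rrr = arr / risk_unvaccinated              # Relative Risk Reduction

print(f"ARR: {arr:.1%}")  # ~1.7% -- sounds unimpressive...
print(f"RRR: {rrr:.0%}")  # ~94% -- ...yet it is the same effect, expressed
                          #         relative to an already small baseline risk
```

In other words, when the baseline risk is only a couple of percent, a vaccine cannot remove more than that in absolute terms, however effective it is.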
My little case study made me experience how disinformation works in this day and age. Not a lot of people follow links to official-looking information, let alone have the mathematical prowess to draw their own conclusions. And even then: I'm not an epidemiologist, so some of the logic behind lockdowns and other measures is above my pay grade. The balance between trust and verification is affected by the lightning speed at which information travels these days. I'm pondering how those same advancements might help us restore a workable balance for our society.
The podcast market is experiencing enormous growth. According to 'the podcasthost', the number of different podcasts has grown at least 4x over the past 3 years. A podcast is attractive as it offers the ability to listen to your topic of choice when and where you want. It's basically fully customized to your own needs and a seriously good alternative to the repeating sequence of news items you normally encounter on a radio station.
I personally find myself using podcasts not only for relaxation but also as time to learn and grow. I even discovered that programme duration is not an issue. I recently finished listening to a six-part series on the Second World War in the Pacific; each episode lasted at least 4-5 hours. The episodes were released as they were created, and from the first episode to the last took more than two and a half years. This did not bother me at all.
Anyone can create and post podcasts. With currently more than 3 million different podcasts, finding the ones that will entertain you is a challenge. Your interests and hobbies are a good starting point for finding something suitable. On my personal list, for instance, are backgrounds to the daily news (the New York Times' 'The Daily'), history (Dan Carlin's 'Hardcore History') and inspiration and optimism (Simon Sinek's 'A Bit of Optimism').
With the world opening up, there'll be plenty of time to listen to podcasts while commuting or on your way to friends and family. At home there are opportunities to listen as well. The lockdown period has taught me various new moments to put on my headphones; podcasts make great amusement whilst cooking, doing the dishes or folding the laundry. You may even start to look forward to those tasks!
I have a love-hate relationship with Social Media. The internet has the potential to be a great connector in our human fabric, but it can also create toxic atmospheres and echo chambers. It can be a huge distraction too. My latest weapon against distraction has been a thorough cleaning and separation of apps. As a rule, 'consumption' apps for media can now only live on my iPad, leaving my iPhone void of anything that lends itself to 'doomscrolling'. When I do engage in Social Media, I keep Nir Eyal's words in mind:
Distraction is only distraction if it really distracts you from something. Otherwise it's entertainment. The physical act of grabbing my iPad signals 'entertainment', making it a conscious choice instead of unconsciously sliding from 'checking my mail' to 'checking Twitter' on my iPhone.
Yesterday I played around with new social kid on the block 'Clubhouse'. It's an audio-based discussion forum, where you can follow people and subjects you're interested in. Anyone can listen in to a 'room' and the moderators can invite listeners to chip in (you can raise your hand), sometimes making for some interesting conversation. The audio focus makes for great consumption while walking, driving or doing chores.
Zooming out to the greater narrative, I love seeing these new apps as experiments in digital human interaction. Clubhouse is an experiment, just as Facebook and Twitter continue to be experiments. If we don't like something, we move on to an alternative (Signal, anyone?). The tools are evolving, and so is our use of them. The growing pains are merely an indication that something is growing.
We've written before about the potential uses of speech and text analysis. Here's a recent application. Analysis of English-language contributions on the social-media platform Twitter, so-called 'tweets', shows that 2020 was, for most people, more miserable than other years. Not a surprising outcome, but still evidence that what we communicate reveals something about how we feel.
The graph below shows the analysis (courtesy of Marine Wagner, who collects what she considers the most interesting graphs and quotes every quarter). There is, of course, the benefit of hindsight, and whether the dips are really associated with the explanations given cannot be proven. Still, the overall trend is clearly visible: English-speaking (and writing) people used less-happy words than before. The influence of Brexiteers might have lost out to people writing about the impact of the pandemic. 😀
On a positive note, I do see an upward trend in the dark-blue line. I wonder: if analysis of tweets can show the mood of people, would we also be able to find out what birds have to say and what the impact is of less air traffic on their happiness?
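For those wondering what such an analysis boils down to technically: one common approach is to score each tweet by the average 'happiness' value of the words it contains. Here is a toy sketch of that idea, with made-up word scores of my own; it is not the researchers' actual method or word list:

```python
# Toy example: made-up happiness scores on a 1-9 scale, not a real research word list.
happiness = {
    "love": 8.4, "great": 7.8, "sunshine": 7.9,
    "virus": 2.6, "lockdown": 2.9, "lonely": 2.3,
}

def tweet_score(text):
    """Average happiness of the scored words in a tweet (None if no words match)."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    scored = [happiness[w] for w in words if w in happiness]
    return round(sum(scored) / len(scored), 2) if scored else None

tweets = ["What a great day, love it!", "Another lockdown, feeling lonely..."]
print([tweet_score(t) for t in tweets])  # [8.1, 2.6] -- higher means 'happier' language
```

Average such scores per day over millions of tweets and you get a mood curve like the one in the graph.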
During a call with investors last week, Twitter CEO Jack Dorsey explained how he wants to start experimenting with people picking out their own favourite recommendation algorithm. That would mean that not Twitter's own, but your choice of algorithm would decide what turns up on your timeline. Dorsey even floated the idea of having a sort of marketplace, where people could offer and shop for algorithms.
A lot of the flak that Social Media companies receive is tied to the mechanisms that decide what content you get to see. We earlier promoted Netflix's 'The Social Dilemma', which explains in shocking detail how algorithms test and promote content with no other apparent goal than to keep you clicking down the rabbit hole for as long as possible. Extended engagement is a metric for success and leads to more potential advertising income.
I have not been a fan of every experiment Jack Dorsey has attempted (just check his beard). While this effort can also be explained as a possible hedge against changes to Section 230 (the contested US law that shields platforms from liability for user-generated content), I tend to like this direction. Even if we have to pay for less manipulative algorithms, we can start seeing that we pay either way: money, privacy, freedom. We'll have a new option that does not challenge our willpower every time we open Social Media, or force us to leave the scene altogether, and that's progress.
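To make the idea a bit more tangible: technically, 'picking your own algorithm' could amount to choosing which ranking function the platform applies to the candidate posts for your timeline. A purely hypothetical sketch (the function names and fields below are my own inventions, not anything Twitter has announced):

```python
# Hypothetical sketch: a timeline that accepts any ranking function you choose.
from typing import Callable

Tweet = dict  # e.g. {"text": ..., "age_hours": ..., "likes": ..., "from_followed": ...}

def chronological(t: Tweet) -> float:
    return -t["age_hours"]           # newest first, no engagement signal

def engagement_driven(t: Tweet) -> float:
    return t["likes"]                # roughly the status quo

def calm_timeline(t: Tweet) -> float:
    # favour people you actually follow, with a mild recency bonus
    return (2.0 if t["from_followed"] else 0.0) - 0.1 * t["age_hours"]

def build_timeline(candidates: list[Tweet], ranker: Callable[[Tweet], float]) -> list[Tweet]:
    return sorted(candidates, key=ranker, reverse=True)

# A 'marketplace' would then simply be a catalogue of such ranker functions,
# written and shared by third parties, for you to plug in.
```

The design choice that matters is the separation: the platform supplies the candidate posts, and you (or a third party you trust) supply the ranking logic.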
Long before the internet existed in its current form, Economist Herbert A. Simon said a wealth of information creates a poverty of attention. In the last decade, this scarce attention has become the main target for a lot of Silicon Valley success stories.
Last week I watched the new Netflix documentary The Social Dilemma on the impact of social media on daily life and our society. While I was not impressed with some of the acting and I did not learn a lot of new facts, watching the documentary did make a big impression on me. Hearing the ex-director of monetization at Facebook explain he keeps his kids completely off social media was a part of that.
While I deeply believe the internet can be a force for good, there is overwhelming evidence of the detrimental effects it is having on our society right now. Like all great technological advances, it can be both a terrible master and a terrific tool. Thankfully, the documentary closes with action-oriented advice and even a website dedicated to the things we can do ourselves. While regulation might be on its way, nothing beats taking matters into your own hands.