Vestibular Exam Questions on English
5,992 questions were found
TEXT
I Used to Fear Being a Nobody. Then I Left Social Media.
By Bianca Brooks
“What’s happening?”
I stare blankly at the little box as I try to think of something clever for my first tweet. I settle on what’s at the top of my mind: “My only #fear is being a nobody.” How could I know this exchange would begin a dialogue that would continue nearly every day for the next nine years of my life?
I began using Twitter in 2010 as a newly minted high school freshman. Though it began as a hub for my quirky adolescent thoughts, over the years it became an archive of my emotional and intellectual voice — a kind of virtual display for the evolution of my politics and artistic identity. But after nine years, it was time to close the archive. My wanting to share my every waking thought became eclipsed by a desire for an increasingly rare commodity — a private life.
Though I thought disappearing from social media would be as simple as logging off, my refusal to post anything caused a bit of a stir among my small but loyal following. I began to receive emails from strangers asking me where I had gone and when I would return. One message read: “Not to be over familiar, but you have to come back eventually. You’re a writer after all. How will we read your writing?” Another follower inquired, “Where will you go?”
The truth is I have not gone anywhere. I am, in fact, more present than ever.
Over time, I have begun to sense these messages reveal more than a lack of respect for privacy. I realize that to many millennials, a life without a social media presence is not simply a private life; it is no life at all: We possess a widespread, genuine fear of obscurity.
When I consider the near-decade I have spent on social media, this worry makes sense. As with many in my generation, Twitter was my entry into conversations happening on a global scale; long before my byline graced any publication, tweeting was how I felt a part of the world. Twitter functions much like an echo chamber dependent on likes and retweets, and gaining notoriety is as easy as finding someone to agree with you. For years I poured my opinions, musings and outrage onto my timeline, believing I held an indispensable place in a vital sociopolitical experiment.
But these passionate, public observations were born of more than just a desire to speak my mind — I was measuring my individual worth in constant visibility. Implicit in my follower’s question “Where will you go?” is the resounding question “How will we know where you’ve gone?” Privacy is considered a small exchange for the security of being well known and well liked.
After all, a private life boasts no location markers or story updates. The idea that the happenings of our lives would be constrained to our immediate families, friends and real-life communities is akin to social death in a world measured by followers, views, likes and shares.
I grow weary when I think of this as the new normal for what is considered to be a fruitful personal life. Social media is no longer a mere public extension of our private socialization; it has become a replacement for it. What happens to our humanity when we relegate our real lives to props for the performance of our virtual ones?
For one, a predominantly online existence can lull us into a dubious sense of having enacted concrete change, simply because of a tweet or Instagram post. As “hashtag activism” has obscured longstanding traditions of assembly and protest, there’s concern that a failure to transition from the keyboard to in-person organization will effectively stall or kill the momentum of political movements. (See: Occupy Wall Street.)
The sanctity of our most intimate experiences is also diminished. My grandfather Charles Shaw — a notable musician whose wisdoms and jazz scene tales I often shared on Twitter — passed away last year. Rather than take adequate time to privately mourn the loss of his giant influence in my life alongside those who loved him most, I quickly posted a lengthy tribute to him to my followers. At the time I thought, “How will they remember him if I don’t acknowledge his passing?”
Perhaps at the root of this anxiety over being forgotten is an urgent question of how one ought to form a legacy; with the rise of automation, a widening wealth gap and an unstable political climate, it is easy to feel unimportant. It is almost as if the world is too big and we are much too small to excel in it in any meaningful way. We feel we need as many people as possible to witness our lives, so as not to be left out of a story that is being written too fast by people much more significant than ourselves.
“The secret of a full life is to live and relate to others as if they might not be there tomorrow, as if you might not be there tomorrow,” the writer Anaïs Nin said. “This feeling has become a rarity, and rarer every day now that we have reached a hastier and more superficial rhythm, now that we believe we are in touch with a greater amount of people. This is the illusion which might cheat us of being in touch deeply with the one breathing next to us.”
I think of those words and at once any fear of obscurity is eclipsed by much deeper ones — the fear of forgoing the sacred moments of life, of never learning to be completely alone, of not bearing witness to the incredible lives of those who surround me.
I observe the world around me. It is big and moving fast. “What’s happening?” I think to myself.
I’m just beginning to find out.
From: www.nytimes.com, Oct. 1, 2019
TEXT
For a $14.95 monthly membership, the website Lumosity promises to “train” your brain with games designed to stave off mental decline. Users view a quick succession of bird images and numbers to test attention span, for instance, or match increasingly complex tile patterns to challenge memory.
While Lumosity is perhaps the best known of the brain-game websites, with 50 million subscribers in 180 countries, the cognitive training business is booming. Happy Neuron of Mountain View, Calif., promises “brain fitness for life.” Cogmed, owned by the British education company Pearson, says its training program will give students “improved attention and capacity for learning.” The Israeli firm Neuronix is developing a brain stimulation and cognitive training program that the company calls a “new hope for Alzheimer’s disease.”
Much of the focus of the brain fitness business has been on helping children with attention-deficit problems, and on improving cognitive function and academic performance in healthy children and adults. An effective way to stave off memory loss or prevent Alzheimer’s — particularly if it were a simple website or video game — is the “holy grail” of neuroscience, said Dr. Murali Doraiswamy, director of the neurocognitive disorders program at Duke Institute for Brain Sciences.
The problem, Dr. Doraiswamy added, is that the science of cognitive training has not kept up with the hype. “Almost all the marketing claims made by all the companies go beyond the data,” he said. “We need large national studies before you can conclude that it’s ready for prime time.”
For centuries, scientists believed that most brain development occurred in the first few years of life — that by adulthood the brain was largely immutable. But over the past two decades, studies on animals and humans have found that the brain continues to form new neural connections throughout life. But questions remain whether an intervention that challenges the brain — a puzzle, studying a new language or improving skill on a video game — can really raise intelligence or stave off normal memory loss.
A series of studies in recent years has suggested that certain types of game training can improve a person’s cognitive performance. In February 2013, however, an analysis of 23 of the best studies on brain training, led by the University of Oslo researcher Monica Melby-Lervag, concluded that while players do get better, the increase in skill hasn’t been shown to transfer to other tasks. In other words, playing Sudoku or an online matching game makes you better at the game, but it doesn’t make you better at math or help you remember names or where you left your car keys.
But other studies have been more encouraging. Last September, the journal Nature published a study by researchers at the University of California, San Francisco, that showed a driving game did improve short-term memory and long-term focus in older adults. The findings are significant because the research found that improvements in performance weren’t limited to the game, but also appeared to be linked to a strengthening of older brains over all, helping them to perform better at other memory and attention tasks.
In addition, brain monitoring during the study showed that in older participants, game training led to bursts in brain waves associated with attention; the patterns were similar to those seen in much younger brains.
Earlier this year, the National Institutes of Health invited applications to more rigorously test brain fitness training to stave off cognitive decline. Researchers say they hope the effort will help establish a consistent standard for determining whether a brain-training intervention works.
But while the science remains unclear, entrepreneurs have seized on what is likely to be a sizable marketing opportunity. In May, hundreds of researchers and businesses will gather in San Francisco for the NeuroGaming Conference and Expo to explore the latest research and the newest technology.
While there is no real risk to participating in the many unproven brain-training games available online and through smartphones, experts say, consumers should know that the scientific jury is still out on whether they are really boosting brain health or just paying hundreds of dollars to get better at a game.
“I’m not convinced there is a huge difference between buying a $300 subscription to a gaming company versus you yourself doing challenging things on your own, like attending a lecture or learning an instrument,” Dr. Doraiswamy said. “Each person has to personalize for themselves what they find fun and challenging and what they can stick with.”
From: www.nytimes.com, March 10, 2014
TEXT
For a $14.95 monthly membership, the website Lumosity promises to “train” your brain with games designed to stave off mental decline. Users view a quick succession of bird images and numbers to test attention span, for instance, or match increasingly complex tile patterns to challenge memory.
While Lumosity is perhaps the best known of the brain-game websites, with 50 million subscribers in 180 countries, the cognitive training business is booming. Happy Neuron of Mountain View, Calif., promises “brain fitness for life.” Cogmed, owned by the British education company Pearson, says its training program will give students “improved attention and capacity for learning.” The Israeli firm NeurMuch of the focus of theonix is developing a brain stimulation and cognitive training program that the company calls a “new hope for Alzheimer’s disease.”
Much of the focus of the brain fitness business has been on helping children with attention-deficit problems, and on improving cognitive function and academic performance in healthy children and adults. An effective way to stave off memory loss or prevent Alzheimer’s — particularly
The problem, Dr. Doraiswa if it were a simple website or video game — is the “holy grail” of neuroscience, said Dr. Murali Doraiswamy, director of the neurocognitive disorders program at Duke Institute for Brain Sciences. The problem, Dr. Doraiswamy added, is that the science of cognitive training has not kept up with the hype. “Almost all the marketing claims made by all the companies go beyond the data,” he said. “We need large national studies before you can conclude that it’s ready for prime time.”
For centuries, scientists believed that most brain development occurred in the first few years of life — that by adulthood the brain was largely immutable. But over the past two decades, studies on animals and humans have found that the brain continues to form new neural connections throughout life. But questions remain whether an intervention that challenges the brain — a puzzle, studying a new language or improving skill on a video game — can really raise intelligence or stave off normal memory loss.
A series of studies in recent years has suggested that certain types of game training can improve a person’s cognitive performance. In February 2013, however, an analysis of 23 of the best studies on brain training, led by the University of Oslo researcher Monica Melby-Lervag, concluded that while players do get better, the increase in skill hasn’t been shown to transfer to other tasks. In other words, playing Sudoku or an online matching game makes you better at the game, but it doesn’t make you better at math or help you remember names or where you left your car keys.
But other studies have been more encouraging. Last September, the journal Nature published a study by researchers at the University of California, San Francisco, that showed a driving game did improve short-term memory and longterm focus in older adults. The findings are significant because the research found that improvements in performance weren’t limited to the game, but also appeared to be linked to a strengthening of older brains over all, helping them to perform better at other memory and attention tasks.
In addition, brain monitoring during the study showed that in older participants, game training led to bursts in brain waves associated with attention; the patterns were similar to those seen in much younger brains.
Earlier this year, the National Institutes of Health invited applications to more rigorously test brain fitness training to stave off cognitive decline. Researchers say they hope the effort will help establish a consistent standard for determining whether a brain-training intervention works.
But while the science remains unclear, entrepreneurs have seized on what is likely to be a sizable marketing opportunity. In May, hundreds of researchers and businesses will gather in San Francisco for the NeuroGaming Conference and Expo to explore the latest research and the newest technology.
While there is no real risk to participating in the many unproven brain-training games available online and through smartphones, experts say, consumers should know that the scientific jury is still out on whether they are really boosting brain health or just paying hundreds of dollars to get better at a game.
“I’m not convinced there is a huge difference between buying a $300 subscription to a gaming company versus you yourself doing challenging things on your own, like attending a lecture or learning an instrument,” Dr. Doraiswamy said. “Each person has to personalize for themselves what they find fun and challenging and what they can stick with.”
From: www.nytimes.com, March 10, 2014
TEXT
For a $14.95 monthly membership, the website Lumosity promises to “train” your brain with games designed to stave off mental decline. Users view a quick succession of bird images and numbers to test attention span, for instance, or match increasingly complex tile patterns to challenge memory.
While Lumosity is perhaps the best known of the brain-game websites, with 50 million subscribers in 180 countries, the cognitive training business is booming. Happy Neuron of Mountain View, Calif., promises “brain fitness for life.” Cogmed, owned by the British education company Pearson, says its training program will give students “improved attention and capacity for learning.” The Israeli firm NeurMuch of the focus of theonix is developing a brain stimulation and cognitive training program that the company calls a “new hope for Alzheimer’s disease.”
Much of the focus of the brain fitness business has been on helping children with attention-deficit problems, and on improving cognitive function and academic performance in healthy children and adults. An effective way to stave off memory loss or prevent Alzheimer’s — particularly
The problem, Dr. Doraiswa if it were a simple website or video game — is the “holy grail” of neuroscience, said Dr. Murali Doraiswamy, director of the neurocognitive disorders program at Duke Institute for Brain Sciences. The problem, Dr. Doraiswamy added, is that the science of cognitive training has not kept up with the hype. “Almost all the marketing claims made by all the companies go beyond the data,” he said. “We need large national studies before you can conclude that it’s ready for prime time.”
For centuries, scientists believed that most brain development occurred in the first few years of life — that by adulthood the brain was largely immutable. But over the past two decades, studies on animals and humans have found that the brain continues to form new neural connections throughout life. But questions remain whether an intervention that challenges the brain — a puzzle, studying a new language or improving skill on a video game — can really raise intelligence or stave off normal memory loss.
A series of studies in recent years has suggested that certain types of game training can improve a person’s cognitive performance. In February 2013, however, an analysis of 23 of the best studies on brain training, led by the University of Oslo researcher Monica Melby-Lervag, concluded that while players do get better, the increase in skill hasn’t been shown to transfer to other tasks. In other words, playing Sudoku or an online matching game makes you better at the game, but it doesn’t make you better at math or help you remember names or where you left your car keys.
But other studies have been more encouraging. Last September, the journal Nature published a study by researchers at the University of California, San Francisco, that showed a driving game did improve short-term memory and longterm focus in older adults. The findings are significant because the research found that improvements in performance weren’t limited to the game, but also appeared to be linked to a strengthening of older brains over all, helping them to perform better at other memory and attention tasks.
In addition, brain monitoring during the study showed that in older participants, game training led to bursts in brain waves associated with attention; the patterns were similar to those seen in much younger brains.
Earlier this year, the National Institutes of Health invited applications to more rigorously test brain fitness training to stave off cognitive decline. Researchers say they hope the effort will help establish a consistent standard for determining whether a brain-training intervention works.
But while the science remains unclear, entrepreneurs have seized on what is likely to be a sizable marketing opportunity. In May, hundreds of researchers and businesses will gather in San Francisco for the NeuroGaming Conference and Expo to explore the latest research and the newest technology.
While there is no real risk to participating in the many unproven brain-training games available online and through smartphones, experts say, consumers should know that the scientific jury is still out on whether they are really boosting brain health or just paying hundreds of dollars to get better at a game.
“I’m not convinced there is a huge difference between buying a $300 subscription to a gaming company versus you yourself doing challenging things on your own, like attending a lecture or learning an instrument,” Dr. Doraiswamy said. “Each person has to personalize for themselves what they find fun and challenging and what they can stick with.”
From: www.nytimes.com, March 10, 2014