English Vestibular Questions - Text Interpretation | Reading Comprehension

4,863 questions were found

Year: 2014 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2014 - UECE - Vestibular - Língua Inglesa - 1ª Fase - 2015.1
Q1280526 English

TEXT

    The global mortality rate for children younger than 5 has dropped by nearly half since 1990, the United Nations said Tuesday in an annual report on progress aimed at ensuring child survival, but the decline still falls short of meeting the organization’s goal of a two-thirds reduction by next year. Without accelerated improvements in reducing health risks to young children, the report said, that goal will not be reached until 2026, 11 years behind schedule.

    Nearly all of the countries with the highest mortality rates are in Africa, the report said, and two countries that are among the world’s most populous — India and Nigeria — account for nearly a third of all deaths among children younger than 5.

    A collaboration of Unicef, other United Nations agencies and the World Bank, the report provides a barometer of health care and nutrition in every country. A child mortality rate can be a potent indicator of other elements in a country’s basic quality of life.

    The report showed that the mortality rate for children younger than 5, the most vulnerable period, fell to 46 deaths per 1,000 live births last year, from 90 per 1,000 births in 1990. It also showed that the gap in mortality rates between the richest and poorest households had fallen in all regions over most of the past two decades, except for sub-Saharan Africa.

    The report attributed much of the progress to broad interventions over the years against leading infectious diseases in some of the most impoverished regions, including immunizations and the use of insecticide-treated mosquito nets, as well as improvements in health care for expectant mothers and in battling the effects of diarrhea and other dehydrating maladies that pose acute risks to the young.

    “There has been dramatic and accelerating progress in reducing mortality among children, and the data prove that success is possible even for poorly resourced countries,” Dr. Mickey Chopra, the head of global health programs for Unicef, said in a statement about the report’s conclusions.

    Geeta Rao Gupta, Unicef’s deputy executive director, said, “The data clearly demonstrate that an infant’s chances of survival increase dramatically when their mother has sustained access to quality health care during pregnancy and delivery.”

    Despite the advances, from 1990 to 2013, 223 million children worldwide died before their fifth birthday, a number that the report called “staggering.” In 2013, the report said, 6.3 million children younger than 5 died, 200,000 fewer than the year before. Nonetheless, that is still the equivalent of about 17,000 child deaths a day, largely attributable to preventable causes that include insufficient nutrition; complications during pregnancy, labor and delivery; pneumonia; diarrhea; and malaria.

     While sub-Saharan Africa has reduced the under-5 mortality rate by 48 percent since 1990, the report said, the region still has the world’s highest rate: 92 deaths per 1,000 live births, nearly 15 times the average in the most affluent countries. Put another way, the report said, children born in Angola, which has the world’s highest rate — 167 deaths per 1,000 live births — are 84 times as likely to die before they turn 5 as children born in Luxembourg, with the lowest rate — two per 1,000.

    The report noted that “a child’s risk of dying increases if she or he is born in a remote rural area, into a poor household or to a mother with no education.”

From: www.nytimes.com, Sept. 16, 2014

Although the United Nations annual report shows the mortality rate for children under 5 has dropped considerably worldwide, it is crucial to note that

Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa - 2ª fase
Q1280289 English

TEXT

The Future Of Work: 5 Important Ways Jobs Will Change In The 4th Industrial Revolution

Source: https://www.forbes.com/2019/07/15

As to what employees could do to prepare for so many changes that are already happening, the text suggests, among other things,
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa - 2ª fase
Q1280288 English

Still regarding the role of employers, the text mentions that they will have to adjust the way they operate so that they are able to attract talented people to work in their company by, for instance,
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa - 2ª fase
Q1280287 English

Amongst the transformations companies will go through, the text highlights a set of skills employers should be searching for when hiring new employees. These would be the ones that
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa - 2ª fase
Q1280286 English

As to the presence of intelligent machines at the workplace, the text argues that it can
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa - 2ª fase
Q1280285 English

In relation to the fluidity of positions in a company, the text mentions that this change would be an attractive feature mainly to
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa - 2ª fase
Q1280284 English

Among the ways in which jobs will change, the text mentions
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa - 2ª fase
Q1280283 English

A report by the McKinsey Global Institute dealing with automation at work has brought evidence that
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa - 2ª fase
Q1280282 English

According to the text, questions related to the changes in how we work have evolved in such a way that it can be
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa
Q1280161 English

TEXT

I Used to Fear Being a Nobody. Then I Left Social Media.

By Bianca Brooks


“What’s happening?”

     I stare blankly at the little box as I try to think of something clever for my first tweet. I settle on what’s at the top of my mind: “My only #fear is being a nobody.” How could I know this exchange would begin a dialogue that would continue nearly every day for the next nine years of my life?

     I began using Twitter in 2010 as a newly minted high school freshman. Though it began as a hub for my quirky adolescent thoughts, over the years it became an archive of my emotional and intellectual voice — a kind of virtual display for the evolution of my politics and artistic identity. But after nine years, it was time to close the archive. My wanting to share my every waking thought became eclipsed by a desire for an increasingly rare commodity — a private life.

     Though I thought disappearing from social media would be as simple as logging off, my refusal to post anything caused a bit of a stir among my small but loyal following. I began to receive emails from strangers asking me where I had gone and when I would return. One message read: “Not to be over familiar, but you have to come back eventually. You’re a writer after all. How will we read your writing?” Another follower inquired, “Where will you go?” 

     The truth is I have not gone anywhere. I am, in fact, more present than ever.

     Over time, I have begun to sense these messages reveal more than a lack of respect for privacy. I realize that to many millennials, a life without a social media presence is not simply a private life; it is no life at all: We possess a widespread, genuine fear of obscurity.

     When I consider the near-decade I have spent on social media, this worry makes sense. As with many in my generation, Twitter was my entry into conversations happening on a global scale; long before my byline graced any publication, tweeting was how I felt a part of the world. Twitter functions much like an echo chamber dependent on likes and retweets, and gaining notoriety is as easy as finding someone to agree with you. For years I poured my opinions, musings and outrage onto my timeline, believing I held an indispensable place in a vital sociopolitical experiment. 

     But these passionate, public observations were born of more than just a desire to speak my mind — I was measuring my individual worth in constant visibility. Implicit in my follower’s question “Where will you go?” is the resounding question “How will we know where you’ve gone?” Privacy is considered a small exchange for the security of being well known and well liked. 

     After all, a private life boasts no location markers or story updates. The idea that the happenings of our lives would be constrained to our immediate families, friends and real-life communities is akin to social death in a world measured by followers, views, likes and shares.

     I grow weary when I think of this as the new normal for what is considered to be a fruitful personal life. Social media is no longer a mere public extension of our private socialization; it has become a replacement for it. What happens to our humanity when we relegate our real lives to props for the performance of our virtual ones? 

     For one, a predominantly online existence can lull us into a dubious sense of having enacted concrete change, simply because of a tweet or Instagram post. As “hashtag activism” has obscured longstanding traditions of assembly and protest, there’s concern that a failure to transition from the keyboard to in-person organization will effectively stall or kill the momentum of political movements. (See: Occupy Wall Street.) 

     The sanctity of our most intimate experiences is also diminished. My grandfather Charles Shaw — a notable musician whose wisdoms and jazz scene tales I often shared on Twitter — passed away last year. Rather than take adequate time to privately mourn the loss of his giant influence in my life alongside those who loved him most, I quickly posted a lengthy tribute to him to my followers. At the time I thought, “How will they remember him if I don’t acknowledge his passing?”

     Perhaps at the root of this anxiety over being forgotten is an urgent question of how one ought to form a legacy; with the rise of automation, a widening wealth gap and an unstable political climate, it is easy to feel unimportant. It is almost as if the world is too big and we are much too small to excel in it in any meaningful way. We feel we need as many people as possible to witness our lives, so as not to be left out of a story that is being written too fast by people much more significant than ourselves.

     “The secret of a full life is to live and relate to others as if they might not be there tomorrow, as if you might not be there tomorrow,” the writer Anais Nin said. “This feeling has become a rarity, and rarer every day now that we have reached a hastier and more superficial rhythm, now that we believe we are in touch with a greater amount of people. This is the illusion which might cheat us of being in touch deeply with the one breathing next to us.”

     I think of those words and at once any fear of obscurity is eclipsed by much deeper ones — the fear of forgoing the sacred moments of life, of never learning to be completely alone, of not bearing witness to the incredible lives of those who surround me.

     I observe the world around me. It is big and moving fast. “What’s happening?” I think to myself. 

     I’m just beginning to find out. 


From: www.nytimes.com, Oct. 1, 2019

As a concluding note, the author acknowledges that, after leaving social media, she
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa
Q1280160 English

Considering the idea of living a “full life”, Bianca Brooks believes that the fast and superficial rhythm of today’s reality may prevent us from
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa
Q1280159 English

The author thinks that always being on social media may reduce the holiness of intimate experiences, and she exemplifies that by describing her attitude
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa
Q1280158 English

As to the reasons that lead people to spend so much time on social media, the author raises the hypothesis that it might be related to a world in which people tend to feel
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa
Q1280157 English

The author states that people are so much into social media that it has
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa
Q1280156 English

For the author herself, Twitter was the platform for important things in her life, including the
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa
Q1280155 English

The author states that for millennials, social media has become so much part of their lives that somehow it comes to be
Alternatives
Year: 2019 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2019 - UECE - Vestibular - Língua Inglesa
Q1280154 English

The author was actively involved with social media for
Alternatives
Year: 2014 Board: UECE-CEV Institution: UECE Exam: UECE-CEV - 2014 - UECE - Vestibular - Língua Inglesa - 1ª fase
Q1280001 English

TEXT

    For a $14.95 monthly membership, the website Lumosity promises to “train” your brain with games designed to stave off mental decline. Users view a quick succession of bird images and numbers to test attention span, for instance, or match increasingly complex tile patterns to challenge memory.

    While Lumosity is perhaps the best known of the brain-game websites, with 50 million subscribers in 180 countries, the cognitive training business is booming. Happy Neuron of Mountain View, Calif., promises “brain fitness for life.” Cogmed, owned by the British education company Pearson, says its training program will give students “improved attention and capacity for learning.” The Israeli firm Neuronix is developing a brain stimulation and cognitive training program that the company calls a “new hope for Alzheimer’s disease.”

    Much of the focus of the brain fitness business has been on helping children with attention-deficit problems, and on improving cognitive function and academic performance in healthy children and adults.

    An effective way to stave off memory loss or prevent Alzheimer’s — particularly if it were a simple website or video game — is the “holy grail” of neuroscience, said Dr. Murali Doraiswamy, director of the neurocognitive disorders program at Duke Institute for Brain Sciences. The problem, Dr. Doraiswamy added, is that the science of cognitive training has not kept up with the hype. “Almost all the marketing claims made by all the companies go beyond the data,” he said. “We need large national studies before you can conclude that it’s ready for prime time.”

    For centuries, scientists believed that most brain development occurred in the first few years of life — that by adulthood the brain was largely immutable. But over the past two decades, studies on animals and humans have found that the brain continues to form new neural connections throughout life. But questions remain whether an intervention that challenges the brain — a puzzle, studying a new language or improving skill on a video game — can really raise intelligence or stave off normal memory loss.

    A series of studies in recent years has suggested that certain types of game training can improve a person’s cognitive performance. In February 2013, however, an analysis of 23 of the best studies on brain training, led by the University of Oslo researcher Monica Melby-Lervag, concluded that while players do get better, the increase in skill hasn’t been shown to transfer to other tasks. In other words, playing Sudoku or an online matching game makes you better at the game, but it doesn’t make you better at math or help you remember names or where you left your car keys.

    But other studies have been more encouraging. Last September, the journal Nature published a study by researchers at the University of California, San Francisco, that showed a driving game did improve short-term memory and long-term focus in older adults. The findings are significant because the research found that improvements in performance weren’t limited to the game, but also appeared to be linked to a strengthening of older brains over all, helping them to perform better at other memory and attention tasks.

    In addition, brain monitoring during the study showed that in older participants, game training led to bursts in brain waves associated with attention; the patterns were similar to those seen in much younger brains.

    Earlier this year, the National Institutes of Health invited applications to more rigorously test brain fitness training to stave off cognitive decline. Researchers say they hope the effort will help establish a consistent standard for determining whether a brain-training intervention works.

    But while the science remains unclear, entrepreneurs have seized on what is likely to be a sizable marketing opportunity. In May, hundreds of researchers and businesses will gather in San Francisco for the NeuroGaming Conference and Expo to explore the latest research and the newest technology.

    While there is no real risk in participating in the many unproven brain-training games available online and through smartphones, experts say, consumers should know that the scientific jury is still out on whether they are really boosting brain health or just paying hundreds of dollars to get better at a game.

    “I’m not convinced there is a huge difference between buying a $300 subscription to a gaming company versus you yourself doing challenging things on your own, like attending a lecture or learning an instrument,” Dr. Doraiswamy said. “Each person has to personalize for themselves what they find fun and challenging and what they can stick with.”

From: www.nytimes.com, March 10, 2014

According to a study conducted by Monica Melby-Lervag at the University of Oslo, applying oneself to an activity such as playing Sudoku
Alternativas
Ano: 2014 Banca: UECE-CEV Órgão: UECE Prova: UECE-CEV - 2014 - UECE - Vestibular - Língua Inglesa - 1ª fase |
Q1280000 Inglês

TEXT

    For a $14.95 monthly membership, the website Lumosity promises to “train” your brain with games designed to stave off mental decline. Users view a quick succession of bird images and numbers to test attention span, for instance, or match increasingly complex tile patterns to challenge memory.

    While Lumosity is perhaps the best known of the brain-game websites, with 50 million subscribers in 180 countries, the cognitive training business is booming. Happy Neuron of Mountain View, Calif., promises “brain fitness for life.” Cogmed, owned by the British education company Pearson, says its training program will give students “improved attention and capacity for learning.” The Israeli firm Neuronix is developing a brain stimulation and cognitive training program that the company calls a “new hope for Alzheimer’s disease.”

    Much of the focus of the brain fitness business has been on helping children with attention-deficit problems, and on improving cognitive function and academic performance in healthy children and adults.

    An effective way to stave off memory loss or prevent Alzheimer’s — particularly if it were a simple website or video game — is the “holy grail” of neuroscience, said Dr. Murali Doraiswamy, director of the neurocognitive disorders program at Duke Institute for Brain Sciences. The problem, Dr. Doraiswamy added, is that the science of cognitive training has not kept up with the hype. “Almost all the marketing claims made by all the companies go beyond the data,” he said. “We need large national studies before you can conclude that it’s ready for prime time.”

    For centuries, scientists believed that most brain development occurred in the first few years of life — that by adulthood the brain was largely immutable. But over the past two decades, studies on animals and humans have found that the brain continues to form new neural connections throughout life. But questions remain about whether an intervention that challenges the brain — a puzzle, studying a new language or improving skill on a video game — can really raise intelligence or stave off normal memory loss.

    A series of studies in recent years has suggested that certain types of game training can improve a person’s cognitive performance. In February 2013, however, an analysis of 23 of the best studies on brain training, led by the University of Oslo researcher Monica Melby-Lervag, concluded that while players do get better, the increase in skill hasn’t been shown to transfer to other tasks. In other words, playing Sudoku or an online matching game makes you better at the game, but it doesn’t make you better at math or help you remember names or where you left your car keys.

    But other studies have been more encouraging. Last September, the journal Nature published a study by researchers at the University of California, San Francisco, that showed a driving game did improve short-term memory and long-term focus in older adults. The findings are significant because the research found that improvements in performance weren’t limited to the game, but also appeared to be linked to a strengthening of older brains over all, helping them to perform better at other memory and attention tasks.

    In addition, brain monitoring during the study showed that in older participants, game training led to bursts in brain waves associated with attention; the patterns were similar to those seen in much younger brains.

    Earlier this year, the National Institutes of Health invited applications to more rigorously test brain fitness training to stave off cognitive decline. Researchers say they hope the effort will help establish a consistent standard for determining whether a brain-training intervention works.

    But while the science remains unclear, entrepreneurs have seized on what is likely to be a sizable marketing opportunity. In May, hundreds of researchers and businesses will gather in San Francisco for the NeuroGaming Conference and Expo to explore the latest research and the newest technology.

    While there is no real risk in participating in the many unproven brain-training games available online and through smartphones, experts say, consumers should know that the scientific jury is still out on whether they are really boosting brain health or just paying hundreds of dollars to get better at a game.

    “I’m not convinced there is a huge difference between buying a $300 subscription to a gaming company versus you yourself doing challenging things on your own, like attending a lecture or learning an instrument,” Dr. Doraiswamy said. “Each person has to personalize for themselves what they find fun and challenging and what they can stick with.”

From: www.nytimes.com, March 10, 2014

While scientific researchers and brain-gaming companies are not working from the same assumptions, Dr. Doraiswamy recommends that people
Alternativas
Ano: 2014 Banca: UECE-CEV Órgão: UECE Prova: UECE-CEV - 2014 - UECE - Vestibular - Língua Inglesa - 1ª fase |
Q1279999 Inglês

TEXT

    For a $14.95 monthly membership, the website Lumosity promises to “train” your brain with games designed to stave off mental decline. Users view a quick succession of bird images and numbers to test attention span, for instance, or match increasingly complex tile patterns to challenge memory.

    While Lumosity is perhaps the best known of the brain-game websites, with 50 million subscribers in 180 countries, the cognitive training business is booming. Happy Neuron of Mountain View, Calif., promises “brain fitness for life.” Cogmed, owned by the British education company Pearson, says its training program will give students “improved attention and capacity for learning.” The Israeli firm Neuronix is developing a brain stimulation and cognitive training program that the company calls a “new hope for Alzheimer’s disease.”

    Much of the focus of the brain fitness business has been on helping children with attention-deficit problems, and on improving cognitive function and academic performance in healthy children and adults.

    An effective way to stave off memory loss or prevent Alzheimer’s — particularly if it were a simple website or video game — is the “holy grail” of neuroscience, said Dr. Murali Doraiswamy, director of the neurocognitive disorders program at Duke Institute for Brain Sciences. The problem, Dr. Doraiswamy added, is that the science of cognitive training has not kept up with the hype. “Almost all the marketing claims made by all the companies go beyond the data,” he said. “We need large national studies before you can conclude that it’s ready for prime time.”

    For centuries, scientists believed that most brain development occurred in the first few years of life — that by adulthood the brain was largely immutable. But over the past two decades, studies on animals and humans have found that the brain continues to form new neural connections throughout life. But questions remain about whether an intervention that challenges the brain — a puzzle, studying a new language or improving skill on a video game — can really raise intelligence or stave off normal memory loss.

    A series of studies in recent years has suggested that certain types of game training can improve a person’s cognitive performance. In February 2013, however, an analysis of 23 of the best studies on brain training, led by the University of Oslo researcher Monica Melby-Lervag, concluded that while players do get better, the increase in skill hasn’t been shown to transfer to other tasks. In other words, playing Sudoku or an online matching game makes you better at the game, but it doesn’t make you better at math or help you remember names or where you left your car keys.

    But other studies have been more encouraging. Last September, the journal Nature published a study by researchers at the University of California, San Francisco, that showed a driving game did improve short-term memory and long-term focus in older adults. The findings are significant because the research found that improvements in performance weren’t limited to the game, but also appeared to be linked to a strengthening of older brains over all, helping them to perform better at other memory and attention tasks.

    In addition, brain monitoring during the study showed that in older participants, game training led to bursts in brain waves associated with attention; the patterns were similar to those seen in much younger brains.

    Earlier this year, the National Institutes of Health invited applications to more rigorously test brain fitness training to stave off cognitive decline. Researchers say they hope the effort will help establish a consistent standard for determining whether a brain-training intervention works.

    But while the science remains unclear, entrepreneurs have seized on what is likely to be a sizable marketing opportunity. In May, hundreds of researchers and businesses will gather in San Francisco for the NeuroGaming Conference and Expo to explore the latest research and the newest technology.

    While there is no real risk in participating in the many unproven brain-training games available online and through smartphones, experts say, consumers should know that the scientific jury is still out on whether they are really boosting brain health or just paying hundreds of dollars to get better at a game.

    “I’m not convinced there is a huge difference between buying a $300 subscription to a gaming company versus you yourself doing challenging things on your own, like attending a lecture or learning an instrument,” Dr. Doraiswamy said. “Each person has to personalize for themselves what they find fun and challenging and what they can stick with.”

From: www.nytimes.com, March 10, 2014

Recent research undertaken with older adults at the University of California, San Francisco, presented significant results in relation to cognitive training using a certain driving game. The results are significant due to the evidence that
Alternativas
Respostas
2281: C
2282: A
2283: B
2284: B
2285: D
2286: A
2287: C
2288: C
2289: C
2290: A
2291: C
2292: B
2293: B
2294: D
2295: A
2296: C
2297: A
2298: A
2299: C
2300: C