
    Weekend reads: 800 retractions from Russia; paying to publish in Vietnam; a retraction involving Facebook, political misinformation, and Teen Vogue

    Before we present this week’s Weekend Reads, a question: Do you enjoy our weekly roundup? If so, we could really use your help. Would you consider a tax-deductible donation to support Weekend Reads, and our daily work? Thanks in advance. The week at Retraction Watch featured: CrossFit being awarded $4 million in sanctions in a case …

    in Retraction watch on January 11, 2020 04:04 PM.


    There are retractors in plastic surgery — and not just of the instrument kind

    A plastic surgeon in Turkey has notched his fifth retraction for plagiarism and other issues. That makes him a retractor — even if most plastic surgeons would have something else in mind if they used that term. Ilteris Murat Emsen, then of the Department of Plastic, Reconstructive and Aesthetic Surgery at the Numune State Hospital …

    in Retraction watch on January 10, 2020 11:00 AM.


    165: Magic as a Tool for Understanding the Brain

    This episode is an encore presentation of an interview with neuroscientists Stephen L. Macknik and Susana Martinez-Conde. We talk about their international bestseller "Sleights of Mind: What the Neuroscience of Magic Reveals about Our Everyday Deceptions." Macknik and Martinez-Conde are neuroscientists who study vision, but several years ago they had the innovative idea of collaborating with magicians to explore how their use of both visual and cognitive illusions reveals secrets about how our brains work.

    This may sound esoteric, but it has practical consequences, especially for making sound decisions in our complex world.

    I will be back in 2 weeks with a new interview with Stephen Macknik.

    Links and References:

    Please Visit Our Sponsors:

    BetterHelp at http://betterhelp.com/ginger

    TextExpander at TextExpander.com/podcast

    Announcements:


    Connect on Social Media:

    Contact Dr. Campbell:

    • Email: brainsciencepodcast@gmail.com
    • Voicemail: http://speakpipe.com/docartemis

    in Brain Science with Ginger Campbell, MD: Neuroscience for Everyone on January 10, 2020 10:00 AM.


    The Emotions Of Music And The Meaning Of Life: The Week’s Best Psychology Links

    Our weekly round-up of the best psychology coverage from elsewhere on the web

    It’s hard for researchers to study the brain activity involved in social interactions when they can only conduct MRI scans on a single person at a time. But what if you could squeeze two people into the scanner at once? At Science, Kelly Servick reports on the development of new, rather intimate imaging arrangements, in which two participants lie face-to-face while having their brains scanned simultaneously.  


    Music can make us feel a range of emotions — but are those experiences common to everyone, or specific to our own cultural groups? A new study has identified “13 distinct and very specific feelings”, such as feelings of triumph or awe, that were shared by both Chinese and American participants when listening to music, reports David Noonan at Scientific American. This suggests that at least some of our music-induced emotional experiences are universal.


    After an exhausting 2019, many of us may be aiming to work on our own well-being this year. But what are the most effective strategies for becoming better, happier people? David Robson has the answers at The Guardian. (And if you’re after more psychology-informed tips for maintaining your new year’s resolutions, check back with Research Digest early next week for a whole feature on the topic).


    An Iron Age skull discovered in the UK in 2008 contained brain matter that had somehow been preserved for 2,600 years. Now scientists think they know why, writes George Dvorsky at Gizmodo. Researchers found that the remarkable preservation was due to the way proteins had become tightly packed in the brain material, preventing the normal decay process.


    “We are not just ghostly entities living inside machine-like bodies in an indifferent world. Human life is not a meaningless space between birth and death, spent trying to enjoy ourselves and forget about our predicament. I believe that human life and the world mean much more than that.” At The Conversation, psychologist Steve Taylor explores what his own research has taught him about the meaning of life.


    How did researchers recently test whether cuttlefish see in three dimensions? By giving them cute 3D glasses and getting them to watch 3D movies, of course. The team found that — contrary to what scientists had previously believed — the creatures do indeed have 3D vision, and use their depth perception for hunting, writes Veronique Greenwood at The New York Times. Check out the article for plenty of videos of the bespectacled cephalopods.


    We tend to take the positive parts of life in our stride, but we can end up really fixated on the bad ones. Why do we find negative experiences so salient? In The Guardian’s ‘Science Weekly’ podcast, Ian Sample interviews psychologist Roy Baumeister on the “power of negativity” — and how to avoid focusing too much on the bad things in life.

    Compiled by Matthew Warren (@MattbWarren), Editor of BPS Research Digest

    in The British Psychological Society - Research Digest on January 10, 2020 09:45 AM.


    “The father is just as upset”: why we need multi-level strategies to support men when a baby dies

    Expectant parents do not anticipate that birth and death will collide. However, for millions of families worldwide each year, the death of their baby due to miscarriage, stillbirth, or neonatal death within the first 28 days of life is a devastating reality.

    Historically, parents’ grief over these losses remained unacknowledged. Over the past few decades, research into the emotional burden of grief has resulted in the development of sensitive healthcare and psychosocial support for families. However, bereaved parents continue to report stigma and societal norms which can minimize their loss.

    Men and baby loss

    Men are increasingly involved in pregnancy and childbirth. Nevertheless, what we know about men’s grief when a baby dies before or shortly after birth remains somewhat limited. Most previous research and subsequent bereavement care guidelines focus on the experiences of heterosexual women, despite men reporting feeling overlooked or ‘forgotten’ as grieving fathers. As one father in our earlier research described:

    “Every time I’d get a text from my friends the text was like how’s [female partner]? […] but you know, the father is just as upset even though he doesn’t necessarily show it in the same way” (John, father to stillborn baby)

    Therefore, to develop a more comprehensive understanding of men’s grief following pregnancy/neonatal loss, our recent systematic review, published in BMC Pregnancy and Childbirth, summarized existing research on how men experience grief and the factors which contribute to men’s grief.

    ‘It’s complicated’

    Our searches identified 46 relevant studies. Grief was highly individual; however, men do seem to face specific challenges, which may complicate or delay their grief and which differ from the experiences of women.


    For example, there was a general trend towards more problem-focused coping strategies such as ‘keeping busy’ or using distractions including activities or work. Feelings of helplessness and powerlessness were also common, along with responsibilities to care for other children, complete paperwork, organize a funeral/burial, and inform family and friends of the loss. Importantly, a tension also existed between expressing their grief and holding it in or “being strong” to support their female partner and family.

    In studies comparing men and women, men generally scored lower on grief measures than women. Importantly, however, the way we currently measure grief may not capture the complexities of men’s grief. Several studies highlighted that commonly used measures may not be sensitive to differences in styles of grief that men may exhibit (e.g., ‘doing’ versus ‘feeling’).

    Four levels

    The factors affecting grief varied across studies. Broadly, we grouped these factors into four different levels: individual, interpersonal, community, and public policy.

    At the individual level, factors included men’s personality, their demographic background, and their attachment to the unborn or newborn baby. In general, men who had developed a strong bond with their baby experienced more intense grief.

    At the interpersonal level, support and acknowledgement from family, friends, and healthcare professionals were critical in determining how well men could express and navigate their grief. A lack of recognition or support led to feelings of isolation which worsened grief.

    At the community level, cultural norms relating to grief and masculinity ideals often led men to feel as though they needed to be the “strong one” or “supporter” to their female partner, rather than a grieving father. This role often took precedence over men’s own needs.

    Finally, policies relating to bereavement leave in the workplace, and the woman-centered nature of pregnancy and maternity care, had the potential to contribute to a sense of isolation for men. Three studies identified that a lack of bereavement leave options for men impacted upon how well they were able to work through their grief, and in seven studies, a focus solely on women’s health, especially in the hospital environment, left many men feeling removed and unsupported.

    A broader concept of grief

    Given this range of factors that appear to affect grief, we have proposed a socio-ecological model of men’s grief, which recognizes that grief does not exist in isolation. Rather, it is shaped by a complex system of interacting factors and levels. Without strategies to address factors at each level, we risk increasing men’s isolation and decreasing both perceived and real access to support.

    from: Obst, et al. 2020

    While individual grief counselling and support groups are important, we also need wider strategies and policies that recognise men as grieving fathers. Suggestions include hospital programs that genuinely engage with fathers and promote an equal partnership throughout pregnancy and childbirth; campaigns to raise awareness of the impact of pregnancy/neonatal loss on families, including men; and evaluating current workplace policies for bereavement leave, to afford men equal opportunity to mourn their loss and access appropriate psychosocial supports.

    The post “The father is just as upset”: why we need multi-level strategies to support men when a baby dies appeared first on BMC Series blog.

    in BMC Series blog on January 10, 2020 09:00 AM.


    Highlights from the American Society of Nephrology Kidney Week 2019

    In November, as the Editor of BMC Nephrology, I had the pleasure of attending the Kidney Week Annual Meeting, organized by the American Society of Nephrology (ASN). The four-day meeting was attended by more than 14,000 international delegates and hosted at the vast Walter E. Washington Convention Center in downtown Washington D.C.

    Inside the atrium of the Walter E. Washington Convention Centre

    Kidney Week began with 10 early programs, followed by an Annual Meeting that was packed with plenary talks, clinical sessions, educational symposia, and poster sessions. Around these sessions, I held meetings with BMC Nephrology Editorial Board Members who kindly agreed to meet with me.

    Each day started with a series of state-of-the-art lectures including:

    • Prof. Jennifer Doudna, University of California, Berkeley, on “Rewriting the code of life: the future of genome editing”
    • Nobel Prize Laureate Prof. Michael Young, Rockefeller University, on “Genes controlling sleep and circadian rhythms”
    • Segway™ inventor Dean Kamen (DEKA Research & Development Corp.) and nephrologists Dr. Bruce Culleton (CVS Health), and Dr. Tod Ibrahim (American Society of Nephrology) on “Perspectives on innovation and transformation in kidney care”
    • Dr. Danielle Ofri, NYU School of Medicine, on “What patients say, what doctors hear”

    The daily programs were crammed with cutting-edge research covering all areas of nephrology and its crossovers with other areas of medicine. Here are some of my highlights.

    State-of-the-art lecture: Innovation and Transformation in Kidney Care

    Dr. Tod Ibrahim speaking at the plenary session.
    Image Source: ASN Advocacy

    “Seventy-six years later, we carry a more powerful computer in our pocket, yet mostly rely on the same technology to treat kidney failure in in-center dialysis” – Tod Ibrahim

    A highlight was the state-of-the-art lecture from inventor Dean Kamen and nephrologists Dr. Bruce Culleton and Dr. Tod Ibrahim, who contextualized the issues of poor health outcomes experienced globally by more than 850 million renal patients; racial inequalities in access to home dialysis; and the discard of usable kidney allografts while patients die on transplant waiting lists. Progress to resolve these issues in the field seems to have stalled, with Ibrahim naming underfunding from the government and investors as a contributor.

    However, the overall feeling was optimistic. The speakers were enthusiastic about how technology could revolutionize future nephrology care, and how initiatives such as Advancing American Kidney Health could help bring investment into future technologies. Furthermore, the US Advanced Regenerative Manufacturing Institute has developed a system that induces pluripotent stem cells to generate segments of bone and ligament within 40 days. This raises the question: how can this area of developmental biology influence the manufacture of kidneys and other organs for transplantation?

    Clinical Practice Session: Pregnancy Outcomes and Kidney Disease

    © Wavebreakmedia / Getty Images / iStock

    I attended BMC Nephrology Section Editor Dr. Giorgina Piccoli‘s (University of Torino, Italy) talk on ‘Reproductive health in women with chronic kidney disease and kidney transplantation’. The risks that women with kidney disease face are worrying: patients on dialysis and those with chronic kidney disease are at higher risk of pre-eclampsia and premature parturition, amongst other complications, compared to the rest of the population. However, it was encouraging and inspiring to hear how Dr. Piccoli approaches these scenarios, and how she values contraception and sexual health counseling and open communication with her female patients in order to understand their wants and needs. Dr. Michelle Hladunewich, Sunnybrook Research Institute, Canada, offered a positive outlook for women on dialysis who hope for a successful pregnancy. She explained that when a patient falls pregnant whilst being treated with dialysis, the risk of premature delivery can be lowered with certain interventions, such as careful attention to clearance rates and induction of labor. With Canada being one of the world’s leaders in live birth rates for women on hemodialysis, perhaps there are some important lessons to be learned here?


    ASN’s next Kidney Week meeting will take place in Denver, Colorado on the 20th-25th October 2020. The BMC Nephrology team certainly look forward to attending future Kidney Week meetings.

    The post Highlights from the American Society of Nephrology Kidney Week 2019 appeared first on BMC Series blog.

    in BMC Series blog on January 10, 2020 08:00 AM.


    Milton Prabu's outsourced art gallery

    "now we have equipped with our own photo microscopy, we didn't depend on the out sourcing fellows to take the photography our slides". - Dr Milton Prabu.

    in For Better Science on January 09, 2020 01:03 PM.


    First-Generation University Students Are At Greater Risk Of Experiencing Imposter Syndrome

    By Emily Reynolds

    Increasing efforts have been made in recent years to encourage students to pursue STEM (science, technology, engineering and maths) subjects. There’s been a particularly positive emphasis on getting a more diverse group of people onto such courses: women, black and ethnic minority groups and working class people have all been the focus of drives and campaigns designed to help them enter STEM careers.

    But, a new study suggests, the competitive nature of STEM courses may be having a knock-on effect on the confidence of certain students, in this case, first-generation college attendees (those who are the first in their family to go to university). Such students, the paper argues, are more likely to experience “imposter syndrome” — the feeling that they don’t belong or don’t have the skills or intelligence to continue their studies — precisely because of this atmosphere of competition.

    In such environments, previous research has shown, students are more likely to compare themselves (often unfavourably) to others. When we feel our peers are our adversaries, rather than colleagues or comrades, we look to their successes and failures to judge ourselves: often, we believe we fall short, and our confidence falters.

    In first-generation students, the paper argues, this can be even more damaging. First-generation students are often raised with communal values, relying on other people rather than seeing them as rivals. When this meets the competitive, individualistic world of STEM courses, it can have a particularly detrimental impact.

    To study the impact of competition on first-generation college attendees, researchers enlisted 818 freshmen and sophomores enrolled in STEM courses at a large U.S. university. Participants first completed a survey measuring perceptions of classroom competition, once at the beginning of term and once after the deadline to drop courses; they rated statements such as “the professor seems to pit students against each other in a competitive manner in this class” on a scale of one to seven. Demographic data was also collected during these surveys, including information on whether participants were first-generation students.

    Six weeks into term, students were sent further surveys to complete daily, asking whether or not they had been attending class. Those who had been attending were asked about imposter feelings, rating statements like “in class, I feel like people might find out I am not as capable as they think I am” on a scale of one to six; those who had not been attending were asked to explain why. The team also recorded how engaged students felt, how often they attended class, how much they thought about dropping out, and their grades.

    As anticipated, those who felt classes were competitive were far more likely to feel as if they were an imposter, unable to keep up with the demands of their course. And compared to those with family members who had gone to university, first-generation students were more likely to experience feelings of imposter syndrome on a daily basis — but only in classes perceived to have high levels of competition. In non-competitive environments, imposter feelings were equal in first-generation and continuing-generation students, suggesting that the atmosphere of the classroom really is a key driver.

    By increasing their imposter feelings, the students’ perceptions of classroom competition also had a negative impact on their achievement, reducing engagement, attendance, and performance, and increasing dropout intentions. This effect was much greater amongst first-generation students.
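
    For readers curious what this kind of finding looks like statistically, here is a minimal sketch of a moderation (interaction) analysis. Everything below is illustrative: the data are synthetic, the effect sizes are made up, and the actual study used daily-diary measures and more sophisticated models. The point is simply that “worse outcomes for first-generation students, but only in competitive classes” shows up as a competition-by-first-generation interaction term.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 818  # sample size reported in the study

        first_gen = rng.integers(0, 2, n)       # 1 = first-generation student
        competition = rng.uniform(1, 7, n)      # perceived competition (1-7 scale)

        # Hypothetical effect sizes: imposter feelings rise with perceived
        # competition, with a steeper slope for first-generation students.
        imposter = (2.0
                    + 0.30 * competition
                    + 0.25 * first_gen * (competition - 1)
                    + rng.normal(0, 1, n)).clip(1, 6)  # 1-6 scale, as in the study

        df = pd.DataFrame({"imposter": imposter,
                           "competition": competition,
                           "first_gen": first_gen})

        # The competition:first_gen coefficient captures the moderation effect.
        model = smf.ols("imposter ~ competition * first_gen", data=df).fit()
        print(model.params)

    In this synthetic example, the competition:first_gen coefficient recovers the extra slope experienced by first-generation students; a flat (near-zero) interaction would correspond to the non-competitive classrooms described above.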

    The team do note that repeatedly seeing questions about imposter syndrome may in fact have triggered those feelings: although measures were limited to once per day in the second part of the study, contemplating competition and achievement may itself have enhanced feelings of insecurity or inadequacy.

    How other identities intersect with the phenomenon was also left unaddressed. Women and people of colour are both more susceptible to imposter syndrome, for example, and exploring how such identities interact with one another could be a focus of future research.

    Creating a welcoming, supportive environment for everybody to study STEM subjects, no matter their background, is key to a diverse and inclusive field. Understanding more about how students of different backgrounds experience STEM studies and actively developing strategies to counter inequalities are both vital steps towards making sure this happens.

    Feeling Like an Imposter: The Effect of Perceived Classroom Competition on the Daily Psychological Experiences of First-Generation College Students

    Emily Reynolds (@rey_z) is a staff writer at BPS Research Digest


    in The British Psychological Society - Research Digest on January 09, 2020 11:41 AM.


    Two retractions prove fresh is best

    To the list of best practices in experimental research, here’s another item to add: Always use fresh cheesecloth when separating biomasses. We’ll explain. A group of researchers at Kansas State University have retracted their 2018 paper, titled “Corn stover pretreatment by metal oxides for improving lignin removal and reducing sugar degradation and water usage,” in …

    in Retraction watch on January 09, 2020 11:00 AM.


    WHO formally retracts opioid guidelines that came under fire

    The World Health Organization has officially retracted its controversial guidelines on the use of opioid analgesics. The agency’s move applies to two statements, issued in 2011 and 2012. Last June, WHO announced that it was “discontinuing” the guidelines in the wake of a critical report which said the documents were heavily tainted by commercial bias. …

    in Retraction watch on January 08, 2020 05:30 PM.


    Public Belief In “Memory Myths” Not Actually That Widespread, Study Argues

    By Emma Young

    The general public has a pretty poor understanding of how memory works — and lawyers and clinical psychologists can be just as bad. At least, this is what many researchers have asserted, notes a team at University College London in a new paper, published in the Journal of Experimental Psychology. However, their research reveals that the idea that most people ignorantly subscribe to “memory myths” is itself a myth.

    The wording of earlier studies, and also discrepancies in how memory experts and the general public tend to interpret the meaning of statements about memory, have painted a bleaker picture of public understanding than is actually the case, according to a series of studies led by Chris Brewin. This has important implications for cases in which ideas about memory are highly relevant — among jurors in a court room, for example.

    The researchers first explored one of the “50 great myths of popular psychology”: that memory is like a video camera.

    The strongest support for this “great myth” comes from a 2011 nationally representative phone survey in the US, in which 24% of those surveyed “strongly agreed” and 39% “mostly agreed” with the statement (the precise wording is important): “Human memory works like a video camera, accurately recording the events we see and hear so that we can review and inspect them later.” Memory experts all disagreed. The case against the public, it seemed, was closed.

    But this study only asked about the video camera analogy, and didn’t ask participants whether they thought memory was ‘like’ anything else. And perhaps, Brewin and his team wondered, experts and lay people might make different assumptions about what the analogy implies: experts may take it to mean that everything we see and hear is faithfully and fully recorded, whereas participants might have thought that our memories store some scenes and interactions accurately, quite (if not entirely) like a video camera.

    Their study of 172 young people from 27 different countries found that the video camera analogy was highly endorsed, but so too were statements likening memory to a “library”, a “diary entry”, and “rooms in a house”. And when the video camera analogy was explicitly linked to the assumption that events could be played back exactly (my italics) as they happened, more than 70 percent disagreed with it.

    A subsequent study of 200 US adults found that people did report experiencing certain memories as a series of scenes “like a videotape”. These included memories of a son’s first birthday, for example, but also of everyday events like conversations with friends. This subjective experience was an important reason why two thirds of this sample endorsed the statement that memory was “partly” like a video camera.

    The vast majority also agreed that all kinds of factors can affect a memory’s accuracy, and that some memories turn out to be mistaken. In fact, they were more likely to agree with the idea that memories are often mistaken, and have the potential to completely decay, than with statements asserting that they are highly accurate, or permanent. “In particular, statements consistent with scientific consensus that memory is neither complete (94%) nor passive (93%) were endorsed with very little disagreement,” the team reports.

    A further study found that most lay people agreed with something that has been considered a myth, but which recent evidence has supported: that, under certain circumstances, the more confidence an eyewitness reports in a memory, the more accurate it is.

    This group’s attitudes to “repressed memories” were also explored. In contrast to experimental psychologists, lay people and some clinical psychologists have tended to endorse the idea that traumatic memories can be repressed for many years and then recovered. This is often put forward as another example of belief in a memory myth.

    But Brewin and his team found that, again, differences in interpretation of the terminology, rather than differences in ideas, seemed to underpin the discrepancy. The lay participants and professional psychologists tended to think of deliberate, rather than unconscious, “repression” of traumatic events. And while the idea that memories can be unconsciously repressed is certainly controversial, deliberate burying of upsetting memories is not.

    Clearly, this new research doesn’t imply that the general public has a perfect understanding of memory. “Nevertheless, the findings suggest that non-expert opinions about memory may be more closely aligned with the evidence than has hitherto been appreciated,” the team writes.

    This is important for legal and clinical contexts, they add. For example, earlier video camera results shouldn’t be taken to indicate that jurors don’t understand how memory works — or that testimony from experts about how memory works is essential in a trial.

    Is the public understanding of memory prone to widespread “myths”?

    Emma Young (@EmmaELYoung) is a staff writer at BPS Research Digest


    in The British Psychological Society - Research Digest on January 08, 2020 10:49 AM.


    The eyes of scallops

    A scallop. Each of the blue dots lining the shell is an eye. Source: Wikipedia.

    Scallops are a family of bivalves. These modest saltwater clams often end up on seafood dinner plates, but did you know that scallops have dozens of image-forming eyes? They focus light onto a multi-layered retina using a telescope-like parabolic mirror. Their super-sensitive visual system, which contains multiple types of opsins typical of both invertebrates and vertebrates, allows them to detect predators from afar and swim away to safety. No wonder they have survived and thrived for hundreds of millions of years! Follow me on this journey through one of the most interesting visual systems in the animal kingdom.

    The eyes

    Macro photo of a scallop’s eyes. Source: Wikipedia.

    Scallops have up to 200 individual eyes about 1 mm across arranged along the edge of their mantle. When scallops grow, new eyes sprout in locations where there are fewer eyes. These eyes can regenerate within about 40 days when damaged, recapitulating their initial growth.

    The eyes have an unusual optical path compared to most vertebrates and invertebrates, using reflection as the primary focusing mechanism. The light goes through a cornea and a lens, as in humans, but is then reflected by a mirror-like layer in the back of the eye.

    Schematic light path in the scallop eye. From Fernald et al. (2006).

    Guanine crystals carefully aligned in the back of the eye act as a photonic material, reflecting light maximally around 500 nm wavelength. This layer of crystals is curved like a parabolic mirror, focusing light primarily on a double-layered retina located about three quarters of the way into the eye.
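
    As a rough illustration of how such a multilayer reflector works: in an idealized quarter-wave stack, each layer has optical thickness nd = λ/4 at the peak wavelength. Here is a minimal sketch using approximate refractive indices from the optics literature; these values are assumptions for illustration, not numbers given in this post.

        lam = 500e-9          # peak reflected wavelength (m), from the text
        n_guanine = 1.83      # approximate refractive index of guanine crystals
        n_cytoplasm = 1.34    # approximate refractive index of the cytoplasm gaps

        for name, n in [("guanine", n_guanine), ("cytoplasm", n_cytoplasm)]:
            d = lam / (4 * n)  # quarter-wave condition: n * d = lam / 4
            print(f"{name} layer thickness for a 500 nm peak: {d * 1e9:.0f} nm")

    The resulting layer thicknesses, on the order of 70-90 nm, are roughly consistent with the tens-of-nanometre guanine plates reported in imaging studies of these mirrors.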

    Guanine crystals form a set of reflective square tiles in the back of the eyes. From Palmer et al. (2017).

    This is functionally similar to a telescope with a parabolic mirror, with a few twists. One twist is that the lens and the mirror are slightly tilted with respect to one another, which means that the image is in focus at different distances depending on the position within the retina, giving the eye multiple focal lengths. A second twist is that scallop eyes have pupils that can contract by up to 50%, decreasing their sensitivity but increasing their spatial resolution. Overall, this arrangement gives scallop eyes a spatial resolution of roughly 2 degrees, enviable compared to, say, the common mouse.
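
    A back-of-the-envelope sketch of the focusing geometry, treating the mirror as spherical (paraxial focal length f = R/2). The eye diameter and the ~2 degree resolution come from the text; the radius of curvature is an assumed round number chosen for illustration.

        import numpy as np

        D = 1.0e-3   # eye diameter (m): ~1 mm, from the text
        R = 0.5e-3   # assumed radius of curvature of the mirror (m)
        f = R / 2    # paraxial focal length of a concave spherical mirror

        # Fraction of the way into the eye (measured from the front) at which
        # the focal plane sits, given that the mirror is at the back:
        print(f"focal plane at {(1 - f / D):.0%} of the eye's depth")

        # Photoreceptor spacing implied by ~2 degree angular resolution:
        theta = np.deg2rad(2.0)
        print(f"implied receptor spacing: ~{f * theta * 1e6:.0f} microns")

    With these assumed numbers the focal plane lands about 75% of the way into the eye, matching the retina’s position described above, and the implied receptor spacing of roughly 9 microns is physiologically plausible.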

    A scallop pupil slowly contracting. From Miller et al. (2019).

    The retinas and the evolution of vision

    Section of a scallop eye (left) with the different subsections (right). From Speiser et al. (2011).

    Scallop eyes have two retinas, the proximal and the distal retina, at different distances from the mirror at the back of the eye. These retinas have led to a fundamental rethinking of the evolution of opsins (light-sensing proteins) and vision. The textbook story used to go that vertebrates see using ciliary opsins (c-opsins), which hyperpolarize in response to light, while invertebrates see using rhabdomeric opsins (r-opsins), which depolarize.

    From this observation it was easy to conclude that eyes evolved independently in vertebrates and invertebrates. An early crack in this tidy story of vertebrate vs. invertebrate eyes was the discovery of two different layers in the scallop retina. The proximal retina shows ON-responses (depolarizes) while the distal retina has OFF-responses (hyperpolarizes in response to light). It’s as if there were two different evolutionary pathways (vertebrate and invertebrate) in the same eye!

    Prototypical vertebrate and invertebrate signal transduction pathways. From Fernald et al. (2007).

    Functionally, the two types of layers seem to have highly complementary roles. The images on the distal retina are in much better focus than the ones on the proximal retina, with linear resolution better by a factor of 10. They form the basis for shape vision in scallops. On the other hand, the proximal retina, with its invertebrate-like ON-cells, is much more sensitive to light, by a factor of about 100. It could underlie vision at night or in very turbulent water.

    Depolarizing and hyperpolarizing photoreceptor responses in the same animal. From Wilkens, Chapter 5 in Shumway and Parsons (Eds.), 2006.

    In the early 2000s, evidence that vertebrates and invertebrates use both types of opsins started accumulating. In fact, we now know many examples of r-opsins in vertebrates and c-opsins in invertebrates. The most famous example, perhaps, is melanopsin, the r-opsin in intrinsically photosensitive retinal ganglion cells (ipRGCs), which regulate sleep and other circadian rhythms in mammals. We now think that r- and c-opsins evolved in the common ancestor of vertebrates, molluscs, arthropods and many other invertebrate families: the urbilateria. This is the posited great ancestor of multicellular animals with bilateral symmetry, the first example of which unambiguously appeared in the fossil record 555 million years ago.

    Alignment of genomes of different species and presumed ancestral linkage groups. Scallop (top left) has the best alignment. From Wang et al. (2017).

    What did urbilateria look like? Recent evidence shows that urbilateria may have looked like… modern scallops! A recent genetic analysis in Wang et al. (2017) revealed a striking correspondence between the scallop genome and reconstructed ancestral linkage groups. This suggests that ancient bilaterians had a karyotype similar to that of modern scallops. The opsins carried by all vertebrates and many bilaterally symmetric invertebrates must have existed all the way back in our common ancestor, which, like modern day scallops, would have contained both c- and r-opsins. It is tempting, then, to picture urbilateria as looking much like a modern scallop. This is by no means a settled debate, however – many alternative body plans have been proposed for urbilateria.

    Visual behaviour

    Scallop running on the sea floor. GIPHY.

    Scallops have persisted without much change for hundreds of millions of years – and indeed they are very well adapted to their environment. Unlike other kinds of bivalves – like mussels, which tend to stick to one spot – scallops move quite a bit. They have three basic moves:

    • Swim forward. They siphon water into their shells and expel it near the hinge, in short bursts. They look delightful doing so. See the gif above.
    • Swim backward (the jump or burst response). They close their shells very fast, which causes them to expel water and move backwards in short bursts. This can also stir up a lot of sediment, helping them escape. You can see this in action in the video below at the 25 second mark.
    • Righting reflex. They perform a complicated spinning maneuver so that the larger valve ends up on the bottom, against the ocean floor.

    They can both swim and jump in response to a decrease in light. This decrease in light is often caused by a predator – often a starfish or snail – coming a little too close for the comfort of the scallop. They will also close their shells in response to a decrease in light to block intruders, presenting their tough exterior to the predator.

    Scallops open and close their valves in response to their visual environment, influenced by the size of floating particles (turbidity) and their speed. They can also orient to light. Some species of scallops prefer to swim towards the light, while others avoid it.

    Interestingly, these behaviours persist with only one eye! Although scallops have many known visual behaviours, it’s still a mystery why their eyes are so numerous, and why they have such high resolution. A larger number of eyes can offer the scallop a larger field of view, but it’s unlikely that there’s any increase in field of view beyond 2-3 eyes, given that each eye individually has a rather large field of view.

    Movement of scallops. Panels A, B: swimming; C: jump; D: righting reflex. Arrows labeled D: direction of movement; W: direction of water. From Wilkens, Chapter 5 in Shumway and Parsons (Eds.), 2006.

    It’s been speculated that some species of scallops migrate, and that they could use their eyes for visual guidance. Another theory is that the multiocular overlap and high resolution give the scallop depth perception, which would be useful to avoid predators. A big impediment to moving this research forward is that it has proven very difficult to record in the lateral lobes of the parieto-visceral ganglia of scallops, the site of visual processing (scallops have no brain).

    Conclusion

    Scallops have an amazing array of image-forming eyes that are highly sensitive to light. Their unusual retina has brought us insight into the evolution of modern day vertebrates, arthropods and mollusks. They support complex behaviours of which we probably know only a small fraction. As better recording tools become available, we will start to be able to study vision in this ancient and underappreciated animal. The biggest mystery in my book is why scallops have so many eyes. Perhaps once we understand their environment, behaviours and visual processing better, we will be able to untangle this mystery.

    in xcorr.net on January 07, 2020 03:01 PM.


    Sleepy animals, hunters and lizards that can (possibly) change sex: the Wildlife Research and Conservation 2019 conference

    Back in October, I had the pleasure, as editor of BMC Zoology, of attending the Wildlife Research and Conservation 2019 (WRC2019) conference in Berlin, Germany.

    Organized by the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW), the European Association of Zoos and Aquaria (EAZA) and WWF Germany, the conference was attended by delegates from more than 40 different countries. It took place in what was certainly the most unusual conference venue I had been to: the Botanic Garden and Botanical Museum Berlin.

    It was a full and varied schedule, with two parallel sessions running over much of the conference. Each session was introduced by one of a series of excellent plenary talks, including:

    • Barbara Taborsky on the significance of developmental plasticity in dangerous environments
    • Michael Cant on kin selection theory and the evolution of social behavior and mating systems
    • John Fa on the consequences of the functional extinctions of animal species, focusing on the impact on the wild meat industry.
    • Michaela Hau on hormones, individual plasticity and fitness
    • Mike Bruford on the genomic management of endangered species

    The sessions themselves were packed with an exciting programme of fascinating research, covering various aspects of wildlife research and conservation. It was difficult to choose which talks to highlight, but here are a couple of my favorites.

    Taking the risk? Effects of aging on timing of hibernation and reproductive investment (presented by Claudia Bieber)

    When small animals hibernate, they greatly reduce the risk of being killed by predators; the risk of predation is far greater during the active season. This contrast makes the life-history theory of these “small hibernators” interesting to explore. Bieber et al. hypothesized that as these animals get older and their remaining opportunities to reproduce dwindle, they accept a greater risk of falling victim to predators during the active season and reduce the amount of time they hibernate to maximize their chances of reproduction. Indeed, their research showed that older edible dormice are not only more likely to reproduce (delaying hibernation) but are also likely to wake up from hibernation earlier than younger dormice. This suggests that hibernation is used in two different ways – first to save energy during harsh environmental conditions, and second as a life-history trait used to maximize the fitness of the species.

    An edible dormouse sleeping

    Analysis of conflict reduction strategies in Iran; Case study: Kharvana district, East Azerbaijan province (presented by Nader Habibzadeh)

    With human population growth comes greater human activity that encroaches on wildlife habitats and increases the risk of human-wildlife conflict. This is particularly the case with carnivores. One such example is the wolf: herders in the Kharvana district, East Azerbaijan province, have complained increasingly about wolves killing their livestock (goats and sheep). Habibzadeh described a study his group conducted that aimed to discover how these herders felt about the wolves and how feasible it might be to use nonlethal methods to prevent the wolves from killing their animals. To this end, a number of herdsmen in the area were interviewed.

    Wolves often hunt and kill livestock

    Most were concerned about their livestock not being adequately protected, and although they did express a desire to kill the wolves, they were also open to the possibility of employing nonlethal strategies; the most popular was using herd dogs to protect the goats and sheep in their care.

    In addition, students were encouraged to submit a poster for the conference poster competition. There were 121 posters in total, covering topics as varied as parasite infections in carnivores, social monogamy in monkeys, prediction models for suitable habitats for Persian leopards and more practical subjects such as smart and efficient box traps for capturing wildlife.

    Poster Competition

    BMC Zoology provided some sponsorship for the event and I was delighted to present the first prize for a poster entitled “Sensitive window for sex determination in a lizard with environmental sex determination” by Barbora Straková, Lukáš Kubička and Lukáš Kratochvíl. The poster introduces preliminary work on the effect of the environment (predominantly temperature, in this case) on sex determination in leopard geckos.

    Can leopard geckos change their sex?

    And to end…

    The conference was closed with a banquet at the beautiful Zoological Garden Berlin.

    The IZW hosts a number of conferences throughout the year, the next being the 6th International Berlin Bat Meeting (IBBM2020), also taking place in Berlin, on the 23rd-25th March 2020.

    I look forward to attending another of their conferences soon.

    The post Sleepy animals, hunters and lizards that can (possibly) change sex: the Wildlife Research and Conservation 2019 conference appeared first on BMC Series blog.

    in BMC Series blog on January 07, 2020 02:06 PM.


    U.S. Appellate Court Enforces CC’s Interpretation of NonCommercial

    The U.S. Court of Appeals for the 9th Circuit reaffirmed Creative Commons’ interpretation of activities that are permissible under the NonCommercial (NC) licenses, which allow bona fide noncommercial reusers to hire out the making of copies of NC-licensed content, even to profit-making businesses such as Office Depot and FedEx Office. Below is an excerpt from the decision:

    “Under the License, a non-commercial licensee may hire a third-party contractor, including those working for commercial gain, to help implement the License at the direction of the licensee and in furtherance of the licensee’s own licensed rights. The License extends to all employees of the schools and school districts and shelters Office Depot’s commercial copying of Eureka Math on their behalf.”

    This is the second time a federal appellate court in the United States has adopted CC’s interpretation of NC. The first decision was published by the U.S. Court of Appeals for the 2nd Circuit last year (summarized here) and involved copying by FedEx Office at the behest of school districts admittedly using the works for noncommercial purposes. 

    CC’s position has been clear in both of these cases: so long as commercial actors are not acting independently for their own commercial gain but solely on behalf of noncommercial actors, they are protected by the license granted to the noncommercial actors.

    After all, entities must act through employees, contractors, and agents as a necessity. To require every teacher, employee (including part-time student employees), and third-party contractor making copies of NC-licensed works to forgo payment for their services would make it impossible for those types of licensees to use the works and facilitate sharing for noncommercial purposes.

    This is a huge win for educators, school districts and, most importantly, students.

    All students deserve access to effective open education resources (OER) and meaningful, inclusive learning opportunities. These NC-licensed OER will help ensure students have access to the effective learning resources they need by allowing schools to seek assistance in making copies when they do not have sufficient resources to do so on their own. Further, because these resources were created using public funds received by the New York State Education Department from the U.S. Department of Education, it’s important that they remain openly licensed.

    In October, Creative Commons requested permission to file an amicus brief and participate in oral argument. Our requests were granted, and our amicus brief (friend of the court brief) with the 9th Circuit became part of the record. Andrew Gass of Latham & Watkins argued the case on behalf of CC (video).

    A very special thanks to Latham & Watkins for their hard work and diligence over the course of both the 2nd Circuit and 9th Circuit cases.

    The post U.S. Appellate Court Enforces CC’s Interpretation of NonCommercial appeared first on Creative Commons.

    in Open Access Tracking Project: news on January 07, 2020 01:11 PM.


    CrossFit wins $4 million sanction in lawsuit stemming from now-retracted paper

    A federal court in California has ruled in favor of the popular training program CrossFit in its lawsuit against a nonprofit group — a competitor in fitness training — awarding the workout company nearly $4 million in sanctions. Why are you reading about this case on Retraction Watch, you might ask? Well, at the heart …

    in Retraction watch on January 07, 2020 12:00 PM.


    165 Free Episode Transcript

    I am including the transcript of this episode for free because it is an encore presentation of an interview that originally aired as BSP 72. It features Stephen Macknik and Susana Martinez-Conde, authors of Sleights of Mind: What the Neuroscience of Magic Reveals about Our Everyday Deceptions.

    in Brain Science with Ginger Campbell, MD: Neuroscience for Everyone on January 06, 2020 08:15 PM.


    Magic crystals and Nobel Science rules

    As one Nobelist retracted her Science paper, another Nobelist has stealthily corrected his. The correction opens new dimensions of probabilities and is indeed best kept hidden.

    in For Better Science on January 06, 2020 05:03 PM.


    Journal retracted at least 17 papers for self-citation, 14 with same first author

    A medical journal in Italy has retracted at least 17 papers by researchers in that country who appear to have been caught in a citation scam. The journal says it also fired three editorial board members for “misconduct” in the matter. The retractions, from Acta Medica Mediterranea, occurred in 2017 and 2018, but we’re just …

    in Retraction watch on January 06, 2020 11:00 AM.


    Weekend reads: Advice from an author with 18 retractions; ‘TripAdvisor for peer review’; theft, indictments, and prison

    Before we present this week’s Weekend Reads — the first of 2020! — a question: Do you enjoy our weekly roundup? If so, we could really use your help. Would you consider a tax-deductible donation to support Weekend Reads, and our daily work? Thanks in advance. The week at Retraction Watch featured: The retraction of a …

    in Retraction watch on January 04, 2020 01:00 PM.


    Prominent cancer researcher loses nine papers, making 10

    The Journal of Biological Chemistry (JBC) has retracted nine papers in bulk by a group of cancer researchers in New York led by the prominent scientist Andrew Dannenberg. The work of Dannenberg’s group at Weill Cornell — and the figures in particular — has been the subject of scrutiny on PubPeer for more than two …

    in Retraction watch on January 03, 2020 08:49 PM.


    Brain Science 2020 (Trailer)

    Brain Science is entering its 14th year, and for the first time since 2008 I will be producing two episodes a month. They will come out on the 2nd and 4th Friday of every month.

    This trailer provides a brief introduction for new listeners, plus a few announcements. The next full episode will be released on January 10, 2020.

    in Brain Science with Ginger Campbell, MD: Neuroscience for Everyone on January 03, 2020 10:00 AM.


    Nobel winner retracts paper from Science

    A Caltech researcher who shared the 2018 Nobel Prize in Chemistry has retracted a 2019 paper after being unable to replicate the results. Frances Arnold, who won half of the 2018 prize for her work on the evolution of enzymes, tweeted the news earlier today: The paper has been cited once, according to Clarivate Analytics’ …

    in Retraction watch on January 02, 2020 09:05 PM.


    PLOS ONE retracts a paper first flagged in 2015 — and breaks the 100 retraction barrier for 2019

    A team of researchers in Saudi Arabia, led by an ex-pat from Johns Hopkins University, has lost three papers for problems with the images in their articles. The three retractions pushed the journal — which has become a “major retraction engine” for reasons we explain here and here — over 100 for 2019. In December, …

    in Retraction watch on January 02, 2020 11:00 AM.


    Euroimmun: Graphic Art, Made in Germany

    EuroImmun is a German diagnostics biotech, founded by the most charming Professor Stöcker and recently sold to PerkinElmer. Elisabeth Bik checked some of the company’s papers, many coauthored by Stöcker himself, and she raised questions. The company says this “testifies to a lack of knowledge of the matter”.

    in For Better Science on January 02, 2020 06:00 AM.


    Computational Psychiatry, Self-Care, and The Mind-Body Problem

    Schematic example of how the “mind” (cerebral cortex) is connected to the “body” (adrenal gland) - modified from Fig. 1 (Dum et al., 2016):
    “Modern medicine has generally viewed the concept of psychosomatic disease with suspicion. This view arose partly because no neural networks were known for the mind, conceptually associated with the cerebral cortex, to influence autonomic and endocrine systems that control internal organs.”

    Psychosomatic illnesses are typically seen in pejorative terms — it's all in your head so it must not be real! Would a known biological mechanism lessen the stigma? For over 40 years, Dr. Peter Strick and his colleagues have conducted careful neuroanatomical tracing studies of motor and subcortical systems in the primate brain. A crucial piece of this puzzle requires detailed maps of the anatomical connections, both direct and indirect. How do the frontal lobes, which direct our thoughts, emotions, and movements, influence the function of peripheral organs?

    In their new paper, Dum, Levinthal, and Strick (2019) revisited their 2016 work. The adrenal medulla (within the adrenal gland) secretes the stress hormones adrenaline and noradrenaline. To trace the terminal projections back to their origins in the spinal cord and up to the brain, the rabies virus was injected into the target tissue. The virus is taken up at the injection site and travels backward (in the retrograde direction) to identify neurons that connect to the adrenal medulla with one synapse: sympathetic preganglionic neurons in the spinal cord. Longer survival times allow the virus to cross second-, third-, and fourth-order synapses. The experiments revealed that cortical influences on the adrenal medulla originate from networks involved in movement, cognition, and affect.
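
    Conceptually, transsynaptic retrograde tracing works like a bounded reverse graph traversal: the virus starts at the injected organ and, with longer survival times, crosses more synapses upstream. Here is a toy sketch of that logic; the node names are hypothetical placeholders for illustration, not the actual circuit mapped in the paper.

        from collections import deque

        # Edges run from presynaptic to postsynaptic neuron (toy circuit).
        edges = [
            ("cingulate_cortex", "brainstem_premotor"),
            ("M1_cortex", "brainstem_premotor"),
            ("brainstem_premotor", "spinal_preganglionic"),
            ("spinal_preganglionic", "adrenal_medulla"),
        ]

        # Build a reverse adjacency map: for each neuron, who projects to it?
        upstream = {}
        for pre, post in edges:
            upstream.setdefault(post, []).append(pre)

        def labeled_after(injection_site, max_synapses):
            """Neurons labeled once survival time lets the virus cross max_synapses."""
            labeled, frontier = set(), deque([(injection_site, 0)])
            while frontier:
                node, depth = frontier.popleft()
                if depth == max_synapses:
                    continue
                for pre in upstream.get(node, []):
                    if pre not in labeled:
                        labeled.add(pre)
                        frontier.append((pre, depth + 1))
            return labeled

        print(labeled_after("adrenal_medulla", 1))  # first-order: spinal preganglionics
        print(labeled_after("adrenal_medulla", 3))  # longer survival: cortical sources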

    Modified from Fig. 5 (Dum et al., 2016). Pathways for top-down cortical influence over the adrenal medulla. Motor areas are filled yellow, and medial prefrontal areas are filled blue. (A) lateral surface. (B) medial wall.

    The mind–body problem: Circuits that link the cerebral cortex to the adrenal medulla

    “The largest influence originates from a motor network that includes all seven motor areas in the frontal lobe. ... The motor areas provide a link between body movement and the modulation of stress. The cognitive and affective networks are located in regions of cingulate cortex. They provide a link between how we think and feel and the function of the adrenal medulla.”
    Based on these anatomical results, the authors concluded with a series of speculative links to alternative medicine practices, including yoga and Pilates; smiling to make yourself feel better; and back massage for stress reduction.
    Because of this arrangement, we speculate that there is a link between the cortical control of 'core' muscles and the regulation of sympathetic output. This association could provide a neural explanation for the use of core exercises, such as yoga and Pilates, to ameliorate stress.
    • The orofacial representation of M1 provides a small focus of output to the adrenal medulla.
    This output may provide a link between the activation of facial muscles, as in a 'standard' or 'genuine' smile, and a reduction in the response to stress.
    • Another large motor output region is in postcentral cortex, corresponding to the sensory representation of the trunk and viscera in primary somatosensory cortex.
    This output may provide a neural substrate for the reduction of anxiety and stress that follows passive stimulation of back muscles during a massage.
    I was a bit surprised to see these suggestions in a high-impact journal. Which leads us to the next topic.

    Self-Care and Its Discontents

    What can be bad about trying to reduce daily stress and improve your own health?

    A recent paper by Jonathan Kaplan (Self-Care as Self-Blame Redux: Stress as Personal and Political) is critical of the way the self-care movement shifts the burden of alleviating stress-related maladies from society to the individual. Economic disadvantage is disproportionately associated with poor health outcomes, to state the obvious. Kaplan argues that focusing on individual self-care blames the victim for their response to a chronically stressful environment, rather than focusing on ways to effect structural changes that improve living conditions. In his efforts to highlight social inequities as a cause of stress-related illnesses, Kaplan goes too far (in my view) in discounting all self-help practices that aim to preserve health.

    It can be empowering for patients to be active participants in their health care, whether at the doctor's office, in the hospital, or at home. One great example is CREST.BD: A Collaborative Research and Knowledge Exchange Network at the University of British Columbia. They've established the Bipolar Wellness Centre (an online resource to support evidence-based bipolar disorder self-management) and developed a Quality of Life Tool (a free web-based tool to help people with bipolar disorder and healthcare providers use CREST.BD’s bipolar-specific quality of life scale).

    Then we have the wellness industry. Depending on what pop health source you read, there are 5, 45, 25, 12, 10, 10, 20 (etc.) essential self-care practices that you can incorporate into your daily routine (if you have the time and money). Wellness lifestyle insta-brands of the rich and famous hold up an impossible standard for upper-middle class white women [mostly]3 to attain. Perhaps our friendly neuroanatomists want to work on their core strength — they can follow @sianmarshallpilates for Pilates inspiration!


    Back to Kaplan's point about blame...




    It's easy to urge your followers to “stay happy!” and “move on!” if you have a net worth of $250 million, and if you don't have a psychiatric diagnosis. These 'Six Things' occupy a place in the pantheon of victim-blaming. People with mental illnesses are not effortlessly able to “stay happy!” or “move on!” or stop repetitive hand-washing (OCD) or avoid reckless spending (manic episode). And this is NOT their fault. And it doesn't make them mentally weak.

    Most psychiatric disorders, in essence, involve thoughts, emotions, and/or behaviors that spin out of control. Here, I'm using control in a colloquial (but not absolute) sense, meaning: it's frequently difficult to stop a downward spiral once it gets started. Although overly simplistic...
    • Major depression involves thoughts (ruminations) and feelings of worthlessness and utter bleakness that spin out of control.
    • Generalized anxiety disorder involves thoughts (worry) about an imagined awful future that spin out of control.
    • Panic disorder involves a thwarted escape or safety response to perceived danger that has spun out of control.
    • Mania involves elevated mood and intense motivation for reward that spin out of control.
    • Obsessive-compulsive disorder involves maladaptive repetitive behaviors (that spin out of control) meant to quell maladaptive worrisome thoughts that have spun out of control.
    • Borderline personality disorder involves overly intense negative emotions that spin out of control and lead to self-destructive behaviors.
    If people were able to control all this (without external intervention), the condition wouldn't reach the level of “disorder” — causing functional impairment and (usually) significant distress (but not always; e.g., people in the midst of a full-blown manic episode lack insight). I know this cartoonish level of description can raise the specter of free will and responsibility, especially in the context of criminal behavior. Are people with antisocial personality disorder not accountable for their horrible deeds? This timeless debate is beyond the scope of this post.


    Computational Psychiatry

    Or you can get mathematically fancy and formalize every single mental illness as a result of “faulty Bayesian priors”. Meaning, the brain's own “prediction machine” has incorporated inaccurate assumptions about the self or others or how the world works. A disordered Bayesian brain also ignores empirical evidence that contradicts these assumptions. The process of active inference — the brain's way of minimizing “surprise” when reconciling a top-down internal model and bottom-up external input — has gone awry (Prosser et al., 2018; Linson & Friston, 2019). Although a sense of agency (or control) is a critical part of the active inference framework, I don't think an impairment in active inference is a choice. Or that one has control over this impairment. In fact, there's a Bayesian formulation of behavioral control (or lack thereof) that considers depression in terms of pessimistic, overly generalized priors, i.e. the depressed person assumes a lack of control over their circumstances.

    Learned Helplessness (Huys & Dayan, 2009).


    Armed with this mathematical model, you can confound the “stay happy!” crowd by using all 24 equations to explain the concept of learned helplessness and its relevance to human depression.
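    To make the faulty-priors idea concrete, here is a minimal sketch in Python (my own toy example, not the actual 24-equation Huys & Dayan model, and all the numbers are invented): represent a belief about the controllability of outcomes as a Beta distribution, and watch a sufficiently pessimistic prior shrug off evidence that acting actually works.

        # Toy illustration (not the Huys & Dayan model): belief about
        # p(success | action) represented as a Beta(a, b) distribution.
        prior_healthy = (2, 2)      # weak, open-minded prior about controllability
        prior_helpless = (1, 50)    # strong pessimistic prior: "nothing I do works"

        successes, failures = 8, 2  # recent experience: acting mostly works

        def posterior_mean(prior, s, f):
            """Mean of the Beta posterior after s successes and f failures."""
            a, b = prior
            return (a + s) / (a + s + b + f)

        print(posterior_mean(prior_healthy, successes, failures))   # ~0.71: control is learned
        print(posterior_mean(prior_helpless, successes, failures))  # ~0.15: the prior swamps the evidence

    The overly generalized prior plays the role of the depressed person's assumed lack of control: the same evidence arrives, but the posterior barely moves.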

    Maybe one day, Bayesians will have a stable of Instagram influencers. Get to work on your branding ideas!


    Footnotes

    1 Thanks to Neuroskeptic for tweeting about this paper, along with the quote that individuals may "end up being seen (and seeing themselves) as responsible for their own failures to adequately ameliorate the stresses that they suffer."

    2 Full Disclosure: my late wife was a Peer Researcher with CREST.BD.

    3 While searching for health and wellness Instagram influencers, I was pleasantly surprised to find @hellolaurenash (a Chicago-based blogger, editor, and yoga and meditation teacher who founded a holistic wellness platform for marginalized communities) and @mynameisjessamyn (a body-positive yoga expert who wants to change the largely white and thin face of yoga and make the practice more accessible to all). I know absolutely nothing about the prevalence of diversity among health and wellness Instagram influencers, just like I know absolutely nothing about Computational Psychiatry.


    References

    Dum RP, Levinthal DJ, Strick PL. (2016). Motor, cognitive, and affective areas of the cerebral cortex influence the adrenal medulla. Proceedings of the National Academy of Sciences 113(35): 9922-9927.

    Dum RP, Levinthal DJ, Strick PL. (2019). The mind–body problem: Circuits that link the cerebral cortex to the adrenal medulla. Proceedings of the National Academy of Sciences 116(52): 26321-26328.

    Friston K, Schwartenbeck P, FitzGerald T, Moutoussis M, Behrens T, Dolan RJ. (2013). The anatomy of choice: active inference and agency. Frontiers in Human Neuroscience 7:598.

    Huys QJ, Dayan P. (2009). A Bayesian formulation of behavioral control. Cognition 113(3):314-328.

    Kaplan J. (2019). Self-Care as Self-Blame Redux: Stress as Personal and Political. Kennedy Institute of Ethics Journal 29(2):97-123. PDF.

    Linson A, Friston K. (2019). Reframing PTSD for computational psychiatry with the active inference framework. Cognitive Neuropsychiatry 24(5):347-368.

    Prosser A, Friston KJ, Bakker N, Parr T. (2018). A Bayesian Account of Psychopathy: A Model of Lacks of Remorse and Self-Aggrandizing. Computational Psychiatry 2:92-114.

    Smash the wellness industry

    ... Wellness is a largely white, privileged enterprise catering to largely white, privileged, already thin and able-bodied women, promoting exercise only they have the time to do and Tuscan kale only they have the resources to buy.

    Finally, wellness also contributes to the insulting cultural subtext that women cannot be trusted to make decisions when it comes to our own bodies, even when it comes to nourishing them. We must adhere to some sort of “program” or we will go off the rails.

    in The Neurocritic on January 01, 2020 03:40 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    A look back at Retraction Watch in 2019 — and forward to our 10th anniversary

    In August 2020, Retraction Watch will turn 10 — a milestone we still can’t quite wrap our minds around. When we started the blog in 2010, we thought we might have enough material for a post or two a month. Little did we know that our little side gig would eventually lead to the world’s … Continue reading A look back at Retraction Watch in 2019 — and forward to our 10th anniversary

    in Retraction watch on December 31, 2019 11:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Psychiatrist who stole grant funds also engaged in research misconduct, says ORI

    Retraction Watch readers may recall the name Alexander Neumeister. In 2016, The New York Times reported on his dismissal from the New York University School of Medicine following claims of misconduct in a trial Neumeister was running. A lot has happened in the case since, including embezzlement charges for which he pleaded guilty. Now, the … Continue reading Psychiatrist who stole grant funds also engaged in research misconduct, says ORI

    in Retraction watch on December 30, 2019 11:01 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    ‘A satisfactory explanation was not provided’: Physicists in India lose third paper

    A team of physicists in India has notched their third retraction for problematic images and other issues that also have prompted at least four corrections of their work.  The authors, Sk. Shahenoor Basha, of the Solid State Ionics Laboratory at KL University in Guntur, and M.C. Rao, of Andhra Loyola College in Vijayawada, have lost … Continue reading ‘A satisfactory explanation was not provided’: Physicists in India lose third paper

    in Retraction watch on December 30, 2019 11:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Withdrawing life-sustaining treatment: To court or not to court?

    Research published in BMC Medical Ethics explores a recent ruling from the highest court in England. The ruling confirms that particular decisions about the removal of life-sustaining treatment can be made by clinicians and families, rather than by the courts, as had been thought. There is much to welcome here, says Professor Richard Huxtable, but also some potential pitfalls to avoid in the future.


    Patients with “prolonged disorders of consciousness” have regular sleep-wake cycles but either absent or reduced consciousness. Those individuals who lack consciousness are (controversially) referred to as being in the “vegetative state”; those with reduced consciousness are described as being in the “minimally conscious state”. Until a 2018 ruling from the Supreme Court in England, it was believed that only a court – not clinicians and not families – could decide whether life-sustaining treatment, in the form of clinically-assisted nutrition and hydration, could be removed from these patients.

    The Supreme Court’s ruling in the case of Re Y has confirmed that difficult or contested cases can still come to court. However, the ruling says that there is no requirement to bring more straightforward cases before a court – for example, if all those involved agree that this treatment should be withdrawn, then this may happen without consulting a judge.

    © Africa Studio / stock.adobe.com

    A positive development

    This looks like a welcome decision. Whether to remove life-sustaining treatment from any patient is inevitably a significant question. But it seemed odd to require a judge to make decisions for these particular patients, who were receiving this particular treatment (clinically-assisted nutrition and hydration).

    Before this ruling, loved ones and healthcare professionals would sometimes be in agreement about the treatment options for the patient, but still the case had to come to court, for a judge to declare whether or not clinically-assisted nutrition and hydration was in the patient’s best interests. Meanwhile, decisions for many other incapacitated patients – such as those with advanced dementia – or about other potentially life-sustaining treatments – such as ventilation – could be made without necessarily involving a judge. Removing this routine requirement to come to court therefore appears to be sensible and also brings English and Welsh law in line with law elsewhere in the world.


    A risky development?

    But the judgment might also carry some less welcome, no doubt unintended, consequences. There are risks in two directions: either a patient might die too soon, or a patient might live too long.

    The risk of dying too soon arises in part from the logic that the courts had long deployed when making decisions about patients in the vegetative state. The key test (then and now) is whether treatment is in the “best interests” of the patient. Yet, some judges signaled that these particular patients had no interests. Some of those judges even indicated that, once the patient was diagnosed as being in the vegetative state, there was no legal basis for continuing to treat the patient.

    © shironosov / Getty Images / iStock

    Of course, that logic was previously only applied in (and by) courts to the particular individuals in question. Will or must that logic now move out of the courtroom and into the clinic or care home? If so, there appears to be a risk that treatment will cease once the diagnosis is settled – perhaps even regardless of what the family or the patient themselves might have wanted.

    This is a worrying possibility, but the bigger worry seems to lie in the opposite direction. Empirical research, especially that undertaken by the Coma and Disorders of Consciousness Research Centre, indicates that patients might end up living too long, again regardless of what they or their loved ones might wish. The data gathered by these researchers suggest that families, clinicians and the health system at large combine to promote “treatment-by-default”. The system appears to be fragmented and neither families nor clinicians appear to be well-informed about the legal principles and processes involved in making treatment and non-treatment decisions for incapacitated adults.


    Going forward

    Fortunately, there are recent, positive developments, which should help to ensure everyone is informed about the legal situation and that blanket assumptions are not made about what is best for an individual patient. Guidance from the British Medical Association emphasizes that the decision must indeed be about what treatment is in the “best interests” of this particular patient. The guidance, which is lengthy, helpfully includes summaries for families, which should help to ensure their participation in this type of decision-making.

    Ethically, this seems to be the right way forward: decisions should be made for, about and ideally with the individual patient. Yet, there are still major ethical questions to be answered. The “best interests” test operates not only in England, but in many other legal systems. Despite its familiarity, the test means different things to different people, and therefore can result in different decisions being made, even in seemingly similar situations. Our team in Bristol is undertaking a major research project, kindly funded by the Wellcome Trust, which aims to explore how “best interests” decisions are and should be made and, indeed, whether this is even the right approach to making healthcare decisions with and for individuals who lack the capacity to decide for themselves.

    Exploring whether “best interests” is best is timely. The same test essentially applies to children and similar questions have recently arisen in relation to critically ill infants, following widely-publicised legal cases like that of Charlie Gard. We hope, then, to be able to help clinicians and families in the future. For now, the ruling in Re Y looks positive, but ongoing discussion and research will be needed to ensure that any decision made is the right decision for the individual patient.

    The post Withdrawing life-sustaining treatment: To court or not to court? appeared first on BMC Series blog.

    in BMC Series blog on December 30, 2019 09:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Weekend reads: How to be a statistical detective; a $5.5 million settlement over hidden grants; 15 studies that challenged medical dogma

    Before we present this week’s Weekend Reads — the last of 2019! — a question: Do you enjoy our weekly roundup? If so, we could really use your help. Would you consider a tax-deductible donation to support Weekend Reads, and our daily work? Thanks in advance. The week at Retraction Watch featured: The story of what happened … Continue reading Weekend reads: How to be a statistical detective; a $5.5 million settlement over hidden grants; 15 studies that challenged medical dogma

    in Retraction watch on December 28, 2019 01:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    ‘Disbelief’: Researchers, watch out for this new scam involving journal special issues

    We’ve seen authors fake peer review by creating fake email addresses, and even companies that use photos of celebrities to lure unsuspecting authors. Now along comes a new scam, this one involving special issues of journals. In “Predatory publishing, hijacking of legitimate journals and impersonation of researchers via special issue announcements: a warning for editors … Continue reading ‘Disbelief’: Researchers, watch out for this new scam involving journal special issues

    in Retraction watch on December 27, 2019 11:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    How a plagiarized eye image in the NEJM was discovered

    The Images in Clinical Medicine section of the New England Journal of Medicine (NEJM) is prime real estate for physicians and others wanting to share a compelling picture with their colleagues. But earlier this month, an eye specialist in Michigan saw double when he looked at the Dec. 5, 2019, installment of the feature.  Depicted … Continue reading How a plagiarized eye image in the NEJM was discovered

    in Retraction watch on December 26, 2019 11:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Zombie fingers inside corroded nano-piecrusts

    Smut Clyde is back with more fraudulent nanotechnology. This time, he presents the works of Dhanaraj Gopi, who designs fabricated surfaces for surgical implants. In Photoshop, or with a pencil.

    in For Better Science on December 26, 2019 06:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    A ‘stress test’ for journals: What happened when authors tried to republish a Nature paper more than 600 times?

    Journal stings come in various shapes and sizes. There are the hilarious ones in which authors manage to get papers based on Seinfeld or Star Wars published. There are those that play a role in the culture wars. And then there are some on a massive scale, with statistical analyses. That’s how we’d describe the … Continue reading A ‘stress test’ for journals: What happened when authors tried to republish a Nature paper more than 600 times?

    in Retraction watch on December 24, 2019 12:18 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Everyone Loves a List, Right?

    Bidding farewell to the decade afforded us a wonderfully surprising trip down memory lane. Researching our milestones, top blogs and best tweets showed us just how far we’ve come since we launched our first journal, PLOS Biology, in 2003.

    It certainly seems like OA publishing has always been an option for authors, but did you know it has only been ten years since PLOS first became fully sustainable through APCs? And we were one of the first to prove the viability of this model. We also had to demonstrate that innovation, experimentation, quality, excellence and fairness could live together hand-in-hand as part of that model.

    We do not operate in a vacuum. We had a lot of help from some notable early adopters as well as those initial reviewers and academic editors that are the heart of any publishing operation. And our success could not have occurred without all of our authors, who entrusted their research to a publisher that simply wanted to do things a bit differently. Thank you. Thank you. Thank you. Now sit back and toggle through time as we celebrate 10 years of history, but with an eye on a bright future.

    2010:  The PLOS suite of Article-Level Metrics is available for each and every published article. 

    2011: Curated PLOS Collections on focused themes ease discovery and provide opportunity for breadth of coverage from PLOS journals, blogs and external sources all in one place.

    2012: The Global Participation Initiative was launched, which aims to tackle barriers to publication based on cost, specifically addressing the lack of funding for publication faced by authors in many countries.

    2013: The Accelerating Science Awards Program (ASAP), co-sponsored by PLOS, honors pioneers using Open Access research to benefit society. The program attracts 200 nominations from 30 countries.

    2014: The PLOS Data Policy ensures a research article’s underlying data is available to the science community, promoting new discovery, replication and validation (arguably bringing the open data conversation into the mainstream). 

    2015: PLOS Communities grow to include Paleontology and Ecology with active social media and live blogging at scientific conferences, expanding the impact and contributions of authors and early career researchers.

    2016: 1.) More granular credit for authors with standardized CRediT roles. 2.) Early sharing of work with preprints and direct transfer to PLOS from bioRxiv. 3.) Implementation of ORCID iDs for authors helps ensure work is properly attributed.

    2017: 1.) Journal and Collections publishing platform Ambra™ licensed open source. 2.) PLOS Channels with external editors provide opportunity for discovery, exploration and contextual insights.

    2018: PLOS Collaborates with Cold Spring Harbor Laboratory to enable preprint posting on bioRxiv

    2019: Published Peer Review history launches

    Top 10 PLOS blogs by page views (note: these posts are relatively recent, and their page-view totals partly reflect the fact that we have many more followers now):

    1. PLOS Journals Now Open for Published Peer Review
    2. Making a home for the physical sciences and engineering at PLOS ONE
    3. Streamlined Formatting PLOS Articles
    4. PLOS Appoints Dr. Joerg Heber EIC of PLOS ONE
    5. Transparency, Credit, and Peer Review
    6. Author credit: PLOS and CRediT Update
    7. CEO Letter to the Community
    8. Power to the Preprint
    9. You’ve completed your review, now get credit with ORCID
    10. PLOS, Cold Spring Harbor Preprint Agreement


    in The Official PLOS Blog on December 23, 2019 03:31 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Why the holidays are the loneliest time for seniors

    The winter holidays are a time to celebrate family, friends, and community. But for the millions of older adults worldwide who have no family, few friends nearby, or are lonely and socially isolated, December is far from the most wonderful time of the year. A survey carried out by AARP in 2017 found that 28 percent of U.S. adults ages 50 and older report that they’ve felt lonely during a holiday season over the past five years, and nearly half (43%) have worried about a friend or family member who was lonely during the holidays.

    The Christmas season may sharpen the dull pains of loneliness, as older adults yearn for their loved ones who have died, or reminisce about happy celebrations in their family home that they have since abandoned for residence in a long-term care facility. Yet social isolation among older adults is a sweeping social problem whose impact extends beyond the family-centric weeks between Thanksgiving and New Year’s. Rising numbers of older adults worldwide have no living or nearby kin. In the United States, nearly 7% of adults ages 55 and older have neither a living spouse nor biological children and 1% have no partner/spouse, children, biological siblings, or biological parents – with these rates rising across successive cohorts. Worldwide rates of kinlessness, or having neither a spouse nor children, range from a low of just 2% in China and Korea, to more than 10% in wealthy western nations including Canada, Ireland, the Netherlands, and Switzerland.

    Rising numbers of kinless older adults are a result of demographic trends over the past century. Declining birth rates mean that older adults today have fewer children than in the past, especially in societies that have maintained restrictive population policies, and where childlessness rates are high. Due to processes of urbanization and globalization, adult children may migrate far distances from their aging parents to pursue rewarding economic opportunities. Moving from the countryside to the city, or from one’s hometown to more lucrative opportunities overseas are especially common among young adults in Asia and the global south. Rising rates of divorce worldwide mean that older adults may no longer live with a spouse. Women are especially likely to grow old alone, both because they tend to outlive their husbands and because they are less likely to find another partner after being divorced or widowed.

    Being kinless isn’t the same thing as being lonely, however. Unmarried and childless adults tend to have larger networks of friends, compared to their peers with spouses and children. Friends can be an essential source of practical support and emotional uplift for older adults, especially in countries where non-family ties are valued as highly as familial ties. And even older adults with family by their side are not necessarily spared emotional loneliness, or a lack of intimacy and closeness in one’s relationships. An older adult who has a stale marriage or chilly relationship with her adult children might feel a sense of aloneness and alienation even when surrounded by others at a lively family dinner. One in four married older adults reports feelings of emotional loneliness, and these rates are even higher for those whose spouses are chronically ill, who have a dissatisfying (or non-existent) sexual relationship, and for whom communication is silent, stilted, or combative.

    An absence or shortage of satisfying social and emotional ties can be harmful and even deadly to older adults. Loneliness and social isolation are serious public health concerns because they are linked to far-ranging health problems including difficulty sleeping, poor cardiovascular health, high blood pressure, depressive symptoms, compromised immune function, and dementia, each of which is linked with mortality risk. The societal problem of loneliness and the health toll it exacts on older adults is so profound that in early 2018, the United Kingdom appointed its first-ever Minister for Loneliness, alongside the launch of a national charitable Campaign to End Loneliness.

    Old age need not be a time of loneliness and isolation, however. Innovative clinical practices, public policies, and community programs can help mitigate loneliness and its personal toll. Health care providers can screen older adults for loneliness as part of their usual geriatric assessment, identifying and providing supports for those at particular risk. Meal delivery programs like Meals on Wheels not only provide nutrition to older adults, but are also effective in reducing their feelings of loneliness. Publicly-funded volunteering programs like Senior Corps that provide older adults an opportunity to learn new skills, interact with others, and give back to their communities help to reduce loneliness and provide the physical and emotional health boosts that come from meaningful social engagement. Continued investments in programs that enhance older adults’ social integration will have payoffs that linger long after the holiday season has passed.

    Featured image credits: Tejas Prajapati via Pexels


    The post Why the holidays are the loneliest time for seniors appeared first on OUPblog.

    in OUPblog - Psychology and Neuroscience on December 23, 2019 01:30 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Endocrinology researcher in South Korea scores four retractions in a year

    Hueng-Sik Choi, a researcher at Chonnam National University in Gwangju in South Korea, is up to four retractions for image manipulation. The latest retraction for Choi, for a 2006 paper titled “Orphan nuclear receptor Nur77 induces zinc finger protein GIOT-1 gene expression, and GIOT-1 acts as a novel corepressor of orphan nuclear receptor SF-1 via … Continue reading Endocrinology researcher in South Korea scores four retractions in a year

    in Retraction watch on December 23, 2019 11:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Aging Healthily: Perspectives from the Older Adults on Mindfulness and Exercise

    The number of people over 60 years old in the United States is expected to double by 2050. And this presents a growing societal challenge: How do we ensure that this population has the skills necessary to lead a healthy lifestyle and remain socially engaged?

    Two promising interventions that could benefit older adults cognitively, emotionally, and physically are exercise and mindfulness. Exercise helps prevent falls, delays disability, enhances cognitive functioning, improves depression, and reverses metabolic diseases. Mindfulness reduces stress, worry, and loneliness; decreases systemic inflammation; and improves mental health, sleep, awareness, self-efficacy, cognitive functioning, and psychological well-being. Given these benefits of exercise and mindfulness, we wanted to know what it is that keeps older adults from initiating and maintaining such health-promoting activities. As it happens, researchers rarely explore the preferences and motivations of older adults in this context, and this lack of research spawned our qualitative study, published in BMC Geriatrics, comparing elderly participants’ perspectives of the benefits and barriers of initiating and maintaining mindfulness and exercise.

    How we explored these questions

    We carried out focus groups with 41 adults aged 65–85 who had recently participated in a clinical trial involving Mindfulness-Based Stress Reduction (MBSR) training, structured exercise, or a combination of both. These three approaches are all effective and feasible interventions that promote healthy behaviors in older adults who are in the midst of mental, social, emotional, and physical change, but we wanted insights into which was most effective from the perspective of study participants who experienced each approach. We used a semi-structured interview to ask participants open-ended questions regarding the benefits, barriers, and facilitators of participating in mindfulness and/or exercise interventions.

    Benefits and barriers

    Most participants reported mental, physical, and social improvements as part of their respective interventions. Participants indicated that the mindfulness training increased their awareness and self-reflection and fostered a more self-accepting attitude. They also saw improvements in their self-care habits and reported having better familial and social relationships. In fact, the social benefits and sense of community were some of the primary motivators for older adults to continue the exercise and/or MBSR interventions. Participants also indicated that the structured exercise led to enhanced mobility and stronger muscles. Overall, our findings suggest that mindfulness training may confer a broader range of benefits than exercise alone, as mindfulness included positive effects on family, social, and marital functioning.

    The main barrier for both the exercise and mindfulness groups was time management. Many participants stated that they were very busy despite being retired. Other barriers included interruptions at home and frequent travelling. This aligns with previous studies suggesting that older adults resist exercise due to competing priorities, among other factors.

    Motivation and opportunity

    Overall, the study indicates that mindfulness training, along with exercise, can serve as a tool to cultivate important healthy lifestyle qualities among older adults, and that the benefits of exercise and MBSR are a motivation in and of themselves. However, research on how to motivate older adults to initiate healthy behavioral changes is lacking. Based on the responses, if it weren’t for the accountability they felt to participate in the research or the incentives provided by the research team, these older adults might never have started making the healthy behavioral changes to begin with. This suggests that older adults may need incentives to begin and maintain behavioral changes beyond their own health benefit.

    We also inquired as to what type of community resources could facilitate the practice and maintenance of exercise and mindfulness. Some of the suggestions from the participants included offering exercise and/or mindfulness classes at libraries, community colleges, Silver Sneakers, OASIS international, or community centers and free or donation-based yoga in public parks. The inclusion of mindfulness practices and exercise in community programs could serve as a stepping-stone to ensure that older adults are well integrated in society.

    The post Aging Healthily: Perspectives from the Older Adults on Mindfulness and Exercise appeared first on BMC Series blog.

    in BMC Series blog on December 23, 2019 09:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Weekend reads: 100 fake professors; study on police killings retracted; false data won’t scuttle company buyout

    Before we present this week’s Weekend Reads, a question: Do you enjoy our weekly roundup? If so, we could really use your help. Would you consider a tax-deductible donation to support Weekend Reads, and our daily work? Thanks in advance. The week at Retraction Watch featured: The top retractions of 2019; Two retractions and three corrections … Continue reading Weekend reads: 100 fake professors; study on police killings retracted; false data won’t scuttle company buyout

    in Retraction watch on December 21, 2019 03:03 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Girls are using data analytics to figure out what’s fair – and what they are capable of

    Girls in under-served communities are learning data science in a program by Girls Inc. of NYC and the Elsevier Foundation

    in Elsevier Connect on December 20, 2019 04:36 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Allegations linking Sci-Hub with Russian Intelligence

    The Washington Post reports that the US Justice Department has launched a criminal and intelligence investigation into Alexandra Elbakyan, founder of Sci-Hub

    in Elsevier Connect on December 20, 2019 02:29 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Duke misconduct case prompts another expression of concern — but not a retraction

    Here’s an expression of concern that raised some eyebrows around the Retraction Watch HQ. The American Journal of Respiratory and Critical Care Medicine has issued an EoC about a 2007 paper by a group of researchers at Duke University (well, at the time, at least) while acknowledging “irregularities in the procedures of a lab that … Continue reading Duke misconduct case prompts another expression of concern — but not a retraction

    in Retraction watch on December 20, 2019 11:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    BS 164 Thirteenth Annual Review Episode

    This is our 13th annual review episode. I share a few highlights from episodes 153-163 and include a few extra reflections on the recent 4-part series about the neuroscience of Consciousness.

    This month's episode transcript is included for FREE.

    Partial list of Books/Authors featured in 2019:

    Please Visit Our Sponsors:

    TextExpander at http://textexpander.com/podcast

    BetterHelp at http://betterhelp.com/ginger

    Announcements:

    Connect on Social Media:

    Contact Dr. Campbell:

    • Email: brainsciencepodcast@gmail.com
    • Voicemail: http://speakpipe.com/docartemis

    in Brain Science with Ginger Campbell, MD: Neuroscience for Everyone on December 20, 2019 10:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Listening to women’s voices: the experience of unplanned out-of-hospital birth in paramedic care in Queensland, Australia

    Paramedics managing out-of-hospital birth

    In Australia, approximately 1,974 births (0.7% of total births) occur prior to arrival at hospital each year. This population, although a small proportion of the paramedic caseload, is associated with considerable perinatal mortality and morbidity and often presents with factors that may complicate paramedic clinical management. While many births that occur in paramedic care are uncomplicated, paramedics are expected to use evidence-based guidelines to identify, manage, and refer patients with high-risk conditions or emerging problems related to pregnancy, recognize and manage a severely compromised pregnant patient, and provide appropriate care during the intrapartum and postpartum period.

    Our new article published in BMC Pregnancy and Childbirth, a companion to our recent article in BMC Emergency Medicine, explores women’s experience of unplanned out-of-hospital birth attended by paramedics in Queensland, Australia. Interviews with twenty-two women revealed that the participants had various reasons for their births occurring in an unplanned and unexpected way, but their reasons were not necessarily the normally assumed ‘precipitate’ or ‘multiparity’ birth.

    Women negotiating the system

    This research identified a relationship between perceived deficits in the current Australian Maternity Care system and the decisions women made when labor began. When paramedics became involved, some women described very positive interactions. However, it was clear that the need for paramedics was sometimes based on the need for medical backup ‘just in case something went wrong.’ These women did not doubt their ability to birth; they were knowledgeable about the birth process and expressed a desire to have components of their birth plan acknowledged. Yet, when these needs conflicted with a paramedic’s care decisions, the women described feeling disempowered. Some women did not understand why paramedics had to perform, during what was meant to be a planned natural experience, a series of medical interventions to which they did not consent.

    Woman-centered paramedic care

    Participants who had positive birth experiences spoke of paramedics who were professional, had good communication skills, and were empathetic and reassuring. They were particularly impressed with paramedics who provided care holistically, acknowledging that the woman was an individual who had physical, emotional, social and spiritual needs. It was amusing during the interviews when women became excited describing an ‘amazing’ and ‘wonderful’ paramedic who put a load of washing on, gave her time for a shower, and thoughtfully left a towel with the baby’s scent on it for the family dog. These were the actions of a paramedic who was not only confident and comfortable with the physical birth process, but also practiced and appeared to appreciate a holistic approach to person-centered care specific to the out-of-hospital environment.

    The authors’ perspective

    The experience of conducting this study and interpreting women’s stories on a topic such as this ignited a personal and highly emotional reaction in the authors. The authors believe that although a baby is born into a family and society, the event of birth itself belongs to the woman – she births the baby. It is a time of joyful anticipation, a moment that should surpass her expectations, and a time when she should feel the most powerful and fearless. However, during these interviews the authors felt disappointment, not with the various ‘professions’ involved, but with some of the individual paramedics, midwives, and doctors that women described in their narratives. The authors took comfort in the fact that when positive care experiences were portrayed, those experiences were exceptional in nature.

    This research motivated the authors to reflect on their own perspective of patient care as a midwife and paramedic. The phrase ‘rest and reassurance’ is one of a few descriptors in the patient care report form that addresses providing psychological support during patient care episodes. It is interesting to reflect on what value paramedics place on this phrase; is it a throwaway line, just a box to be ticked routinely, or do paramedics actually demonstrate it from the patient’s perspective?

    Researching the ‘person’ as well as the physical body

    We acknowledge that the scope of skills expected of paramedics has increased over the last 20 years, as has the scope of paramedics’ roles and responsibilities. As such, paramedic education and research have been preoccupied with a purely biophysical approach. However, from this research and others referenced within it, when patients are asked what they value most from paramedic care, they reply ‘professionalism.’ This includes confidence in care, clinical competence, and a high level of clinical knowledge. They also want person-centered care which is respectful and responsive to the preferences, needs, and values of the individual patients themselves. We intend this research to contribute to necessary improvements in paramedic education and to provide a basis for a paramedical model of care that is patient-centered, embodies respect and empathy for patients, and draws on paramedics’ empathy and interpersonal communication skills.

    The post Listening to women’s voices: the experience of unplanned out-of-hospital birth in paramedic care in Queensland, Australia appeared first on BMC Series blog.

    in BMC Series blog on December 20, 2019 09:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    4 points from experts on driving open science

    Researchers and librarians need to talk to enable open science – and other takeaways from Elsevier’s Open Science Practices and Perspectives seminar in Poland

    in Elsevier Connect on December 19, 2019 10:46 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    2019: a lightly bamboozled review of the year in neuroscience

    I think we all need a little lie down

    Wow, who’d have thought 2019 would be the year we cracked the brain? Feet up, job done everyone. No doubt you, like me, were stunned when it turned out the zona incerta was the key to unlocking everything. That innocent bundle of neurons, tucked up neatly betwixt the thalamus and the brain’s basement, revealed as the nexus through which all action and perception flow.

    (But then we always suspected the cortex is a mere blanket for regulating the temperature of the crucial parts of the brain. And while some continue to protest that “the folds of human cortex are what makes us unique!”, they are denying the simple, near universally-accepted explanation — greater surface area improves heat dissipation.)

    Welcome one, welcome all to the fourth annual review of the year in neuroscience from The Spike. The above is all fantasy, of course. (Except the bit about the cortex). But some key areas of neuroscience have seen an explosion of exciting and provocative work this year. So come with me as we take a tour of the three that caught my eye: the specialness of the human brain; the sci-fi world of filming voltage; and the mind of the worm laid bare.

    1. “What’s so special about being human?”

    Yes, it’s Jose Mourinho. If I had to explain, it wouldn’t be funny.

    What’s so special about the human brain? A question broad and deep, one that has obsessed thinkers since Antiquity. Our once god-like status, apart and above from all animals, now steadily eroded by science, from natural selection placing us as but one species evolved from and in parallel to many others, to genetics putting the boot in by showing that we share 90 percent of our DNA with cats. Including my cat Bob, and he constantly falls off the back of the sofa.

    You’d have thought neuroscience would have been all over the question of what’s special about the human brain, what with it being the study of the brain and all. But what we could do has been very limited. Largely we have only been able to observe behaviour, stuff we can do that other species cannot: talking endlessly in complex grammars, voting for bell-ends, that sort of thing. And supplement these observations with crude measures of the brain’s structure — how many neurons it has, which bits are thicker or larger than others, which bits are folded — compare them to other species, and go “ooo look it’s different”.

    But to understand why the human brain is “special” we need some kind of theory as to why any of those crude brain differences would make any contribution to that specialness. For a start, to know what’s different about the neurons themselves: what’s different about the types of neurons that exist, or the signals they send, or both. Which is the preserve of us “systems” neuroscientists. Yet systems neuroscience hasn’t had much to say about this question, because of the deep problems of measuring neurons in humans. Until, that is, this year.

    A. Special codes
    In a brave paper, Pryluk and colleagues attempted a direct comparison of how the code used by neurons differed between humans and monkeys. They took long recordings of single neurons from the amygdala and cingulate cortex of monkeys. And compared them to similar recordings from the same regions in humans. These human recordings are ultra-rare: they came from patients with epilepsy that was both so serious and so unresponsive to drugs that they were being prepped for surgery to remove the part of the brain causing the seizures — and to find that part, they had electrodes implanted for a week or more. And while these electrodes were in there, and while lying in their hospital bed, the patients graciously agreed to do a series of tasks for the experimenters.

    With these precious data to hand, Pryluk and friends asked a straightforward question: how much information are these neurons sending? In practice, this was a tough question to ask, as there are all sorts of things to compensate for, like correcting for differing firing rates across neurons and between species. But if we believe their measurement of how much information a neuron sends, their end result is clear. Human neurons in both the amygdala and cingulate cortex send more information — in that they are closer to the maximum possible rate of information sending — and do so more efficiently: they send fewer spikes for the same amount of information. Which means? Who knows. But their results point to human cortex having an increased capacity, so that much more information can be represented across a population of neurons, but at the cost of less robust coding — if fewer spikes are used, the message being transmitted is more sensitive to failure and noise. And as we know, the human brain is very sensitive to failure.
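    As a rough illustration of what “efficiency” means here, a sketch of my own in Python (not Pryluk and colleagues’ actual analysis, which had to correct for firing-rate differences across neurons and species): chop a binary spike train into words, estimate the entropy of those words, and compare it to the maximum entropy achievable at the same mean firing rate.

        import numpy as np

        def word_entropy(spikes, word_len=8):
            """Empirical entropy (bits) of binary spike 'words' of length word_len."""
            n_words = len(spikes) // word_len
            words = spikes[:n_words * word_len].reshape(n_words, word_len)
            codes = words.dot(1 << np.arange(word_len))   # encode each word as an integer
            _, counts = np.unique(codes, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def coding_efficiency(spikes, word_len=8):
            """Observed word entropy relative to the maximum possible entropy
            at the same mean spike probability per bin."""
            r = spikes.mean()
            h_max = word_len * (-r * np.log2(r) - (1 - r) * np.log2(1 - r))
            return word_entropy(spikes, word_len) / h_max

        rng = np.random.default_rng(0)
        train = (rng.random(100_000) < 0.05).astype(int)  # toy spike train, 5% rate
        print(coding_efficiency(train))                   # close to 1: near-maximal efficiency

    On this definition, “more efficient” neurons are those whose word entropy sits closer to that ceiling: more message per spike.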

    B. Special neurons?
    While the Pryluk paper hinted at something special about how the human cortex encodes information, it didn’t tell us anything about whether this is because the types of neurons are special to humans. We get much of our deep understanding of types of neurons from mice, thanks to their being the workhorse of genetics (a good thing that the actual workhorse is not the workhorse of genetics, otherwise Janelia Farm would be, literally, a farm. And about 10000 square miles in size). Hence “what’s so special about the human brain?” translates in genetics to: how do we differ from mice?

    A Nature paper from the Allen Brain Institute tackled this question head-on by directly comparing the gene expression between the cortex of the human and mouse. To do that, they first had to solve the small problem of accurately sequencing the RNA-expression of single neurons in the human cortex. Having cracked that, they then grouped all their neurons into types according to the similarity of their expressed genes. The result? 69 different types of neurons in the human cortex, of which 24 are excitatory (as in, they expressed glutamate) and 45 are inhibitory.
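    The grouping step is, at its core, clustering cells by expression similarity. A bare-bones sketch in Python of that idea (toy data and scikit-learn’s k-means stand in for the Allen Institute’s far more elaborate pipeline):

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)

        # Toy single-cell expression matrix: 300 cells x 50 genes, drawn from
        # three hypothetical "types" with different mean expression profiles.
        centers = rng.normal(0, 2, (3, 50))
        true_types = rng.integers(0, 3, 300)
        expression = centers[true_types] + rng.normal(0, 1, (300, 50))

        # Group cells by similarity of their expressed genes.
        kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(expression)
        print(np.bincount(kmeans.labels_))   # number of cells in each recovered cluster

    The real analysis has to discover the number of clusters from the data, which is where the 69 comes from; here it is baked in for brevity.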

    So which of these 69 types of neuron are unique, are responsible for endowing us humans with our special brain thinking stuff? None. All 69 are also closely matched in the mouse cortex. The major difference is not the type of neuron, but where they are found. In mice, all 24 types of excitatory cell stick to one particular layer of the cortex. In humans, many of those same excitatory types appear in more than one layer of the cortex. And why would this make the human cortex special? Who knows.

    Yet their results are strong confirmation of what we all know: evolution is a tinkerer. We may have diverged from rodents about 90 million years ago, but in something as complex as the mammalian brain, in which genes define merely an outline of the details of the adult brain, big changes will almost always be catastrophic. So the main difference between the neurons in the mouse and human cortex is not the proliferation of brand new types of neurons, but the repurposing of what is already to hand. Oh, and the fact that mice have about 10 million neurons in their cortex, and we have about 17 billion.

    C. Dendrites predict intelligence?
    We like to think humans are the most intelligent species on the planet, in the face of overwhelming evidence to the contrary. A provocative paper at the very end of 2018 asked what it is about our neurons that makes us intelligent. And the answer is: the more complex the dendrites of pyramidal neurons in our cortex, the higher our IQ. Wow!

    Well, maybe. Such unusual claims deserve close scrutiny. After all, extraordinary claims demand extraordinary evidence. As we have no theory which predicts that more complex dendrites have anything to do with intelligence, we need some pretty compelling evidence to believe this is not just happenstance correlation. The researchers obtained another rare sample of human cortical neurons: in this case, from bits of the temporal lobe removed during brain surgery, placed on ice, then popped into the experimental set-up as soon as practically possible while the neurons still lived. They took a range of measurements from the neurons. Each patient took an IQ test. And the researchers correlated some of the measurements with the IQ scores. Why these measurements? No reasons given — so already alarm bells are sounding about what other measurements were correlated with IQ, found to be lacking, and omitted from the paper.

    Is the evidence compelling? No. The key evidence is the correlation between the total length of the dendrites and the IQ of the patient. Namely, this figure:

    From Goriounova et al (2018). Each symbol is the average over the neurons in one subject; error bars are one standard deviation. The correlation (r value), regression (black line), and confidence interval of regression (blue shading) all appear to be taken from the symbols — i.e. the average scores.

    Leaving aside the fact that this is the best correlation they have, and it is still weak (explaining 26% of the variance), take a closer look. Each data-point is a patient, so the value for the length of the dendrites is an average over the measured neurons in that patient. Now you don’t have to be a neuroanatomy geek to know that pyramidal neurons come in a bewildering variety of shapes and sizes, so averaging over them is a bit… Well, charitably we’d call it weird. More bluntly, meaningless. And I’ve just told you that human cortex contains about 24 types of excitatory neuron, and most of those are some kind of pyramidal neuron. This correlation contains just 72 pyramidal neurons in total. So it hideously undersamples the diversity of pyramidal neuron dendrites in human cortex.

    Worse, the above figure and others in the paper are textbook examples of how not to compute a correlation. The correlations are computed using averages — without taking into account how wrong those averages might be. And they could be so wrong that the correlation disappears completely. Indeed, looking at the range of data variation (the error bars) in the above figure, I’d wager the correlation would disappear if tested properly (more on this in the Appendix below).
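    To see how fragile a correlation computed on averages can be, here is a quick simulation in Python (all numbers invented purely to illustrate the statistical point, not taken from the paper): generate per-subject average dendrite lengths that correlate moderately with IQ, then propagate error bars of roughly the size shown in the figure by redrawing each subject’s mean from its own uncertainty.

        import numpy as np

        rng = np.random.default_rng(42)
        n_subjects = 25   # roughly the number of subjects behind those 72 neurons

        # Hypothetical per-subject data: IQ, mean dendrite length (um), and a
        # large per-subject SD standing in for the error bars in the figure.
        iq = rng.normal(100, 15, n_subjects)
        mean_len = 11_000 + 30 * (iq - 100) + rng.normal(0, 900, n_subjects)
        sd_len = np.full(n_subjects, 2_500)

        print("r on the bare averages:", np.corrcoef(mean_len, iq)[0, 1])

        # Propagate the error bars: redraw each subject's mean from its own
        # uncertainty and recompute the correlation many times.
        rs = [np.corrcoef(rng.normal(mean_len, sd_len), iq)[0, 1] for _ in range(5_000)]
        lo, hi = np.percentile(rs, [2.5, 97.5])
        print(f"r after resampling within error bars: 95% of values in [{lo:.2f}, {hi:.2f}]")

    If that interval comfortably straddles zero, as it does with error bars this wide, the headline correlation tells you very little.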

    Finally, a simple thought experiment. These neurons happen to come from the temporal lobe of the cortex, a region plausibly involved in some kind of “thinking” that might contribute to an IQ score. But that was just because these patients had epilepsy, and the temporal lobe usually contains the region that starts epileptic brain activity. But what if these samples had been from primary visual cortex (V1)? They’d find the same diversity of sizes of pyramidal neuron dendrites, because types of pyramidal neuron are largely consistent across the cortex. But if they’d reported a correlation between the size of dendrites in V1 and a person’s IQ score, who would have taken them seriously?

    2. Voltage imaging explodes

    People often say to me “Mark, you Nietzschian ubermensch, what’s the next big thing in systems neuroscience?” And I reply: “well, Mum, since you asked: it’s voltage imaging”.

    To know the brain, we want to know what messages big groups of neurons are sending, and what they are receiving. Voltage imaging is the solution: the filming of neurons as they glow in proportion to their membrane voltage, a real-time readout of the detailed electrical activity of every neuron in a population. If we can get it to work in mammalian brains, it will be a mind-blowing tool for understanding the signals neurons use — not recording just every spike they send, but also the spikes they receive.

    Indeed I’ve been banging on about voltage imaging being the way forward for understanding neurons for some time (like here in 2017). That’s in part because I’ve had the rare privilege of working with voltage imaging data from the gorgeous sea-slug Aplysia since 2011, thanks to Angela Bruno and Bill Frost. So this year was, for me, deeply exciting.

    We had one, two, three major papers all announcing new types of voltage sensors that work beautifully in mammalian neurons: sensors that last for ages, have big signals, and can record detailed voltage traces from multiple neurons at once.

    Why is this huge? I wrote a whole piece about why: but the key idea is simple. Voltage imaging combines the strengths of calcium imaging and recording with electrodes, while solving their problems. With calcium imaging we can see neurons, know neurons, and tag specific neuron types to record from, but calcium itself, the thing being measured, is a slow and indirect measure of spikes. With electrodes we get fast, direct measurements of spikes, but don’t know which neurons or exactly where or what types. Voltage imaging gets all of that: we can see the neurons, know the neurons, tag specific types of neurons, and still record spikes quickly and directly. And more: because we can see not just spikes, or things that are proxy for spikes, but also the voltage changes between spikes — the receiving of inputs!
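    To see why the slowness of calcium matters, a toy sketch in Python (the kernel and numbers are invented, though GCaMP-style indicators really do decay over hundreds of milliseconds): two spikes 20 ms apart stay distinct in a fast voltage-like signal, but merge into one broad bump once convolved with a slow calcium kernel.

        import numpy as np

        dt = 0.001                       # 1 ms bins
        n = 1000
        spikes = np.zeros(n)
        spikes[[400, 420]] = 1           # two spikes, 20 ms apart

        # Toy calcium indicator: slow exponential kernel (~500 ms decay).
        t = np.arange(0, 1, dt)
        kernel = np.exp(-t / 0.5)
        calcium = np.convolve(spikes, kernel)[:n]

        # A voltage-like readout keeps both events separate; the calcium trace
        # piles them into one slowly decaying bump.
        print("distinct spike events:", int(spikes.sum()))
        print("calcium at 400/420/600 ms:",
              calcium[400].round(2), calcium[420].round(2), calcium[600].round(2))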

    Researchers of invertebrate nervous systems have been making major breakthroughs using these sensors — in dye form — for decades. So with voltage imaging about to become a working reality in mammals, the mind boggles about the major breakthroughs that beckon.

    3. Mind of the worm, complete

    For the last 30 years, waggish commentators on neuroscience have defaulted to the slogan “but we’ve known the complete wiring diagram of C Elegans brain since 1986, and we still don’t know how the mind of the worm works!”. Sometimes this is pointed at theorists’ lack of progress. Or at the pointlessness of doing neuroscience in advanced animals. Often it is pointed by some group of neuroscientists at another group of neuroscientists. Whomever it’s pointed at, we’re all meant to hang our heads in shame, and contemplate the meaninglessness of our research, nay the very purpose of studying the brain.

    Well, we have news for these people. The wiring diagram was never complete. Not even close.

    The White et al paper in 1986 was a staggering, decade-long effort to map the nervous system of the tiny nematode worm C Elegans by hand. But it was clearly incomplete. It mapped 279 of the 302 neurons in the hermaphrodite of the worm; and evidently missed an unknown number of connections between even them. Chklovskii’s lab updated the wiring diagram in 2011 with some of the missing connections.

    This year, we finally got something close to complete: a detailed mapping of the wiring diagram in both sexes of C Elegans (with free pull-out and keep poster in Nature!). All 302 neurons in the hermaphrodite; and 385 neurons in the male. And the 6334 connections in the hermaphrodite, and 7070 in the male — including which motorneurons connected to which muscles. And a new type of information too, the strength of the connections, given by the size of the synapse of one neuron onto another. Phew. Epic effort by Cook et al in the Emmons lab.

    So after a detente of a few years, can we now expect the wags to return with “but we’ve known the complete wiring diagram of C Elegans brain since 2019, and we still don’t know how the worm works!” and for us to take them seriously? Nah. For a start, this wiring diagram is still not technically complete. There are differences between the sexes in the same neurons, so the study possibly missed some connections. Some connections between neurons were not detected but assumed to be there due to “repetitive” wiring. Worse, the wiring diagram is still not a single animal, but a mosaic reconstruction from multiple animals. So individual variations due to the happenstance of development will have been mixed together.

Indeed, this gargantuan effort is also a case-study in arguing about the usefulness of connectomics. For what did we learn? Not a great deal, to be honest. Just like the sequencing of the human genome, the value of this updated wiring diagram will be in how it’s used, not in its mere existence.

For starters, we can all look forward to a swathe of by-the-numbers network theory studies now, where this new wiring diagram is analysed to death for its modularity and topology and wiring efficiency and core-periphery and spatial embedding and all the other stuff. More interestingly, people will need to take another look at the dynamics the wiring diagram imposes. Most obviously, there is the high-profile work of Laszlo Barabasi’s team, who used “network control” to predict which neurons are crucial to locomotion in C Elegans based solely on the wiring diagram, and then had those predictions confirmed by ablating those neurons. With a new, more complete wiring diagram now to hand, presumably someone is checking that the network control theory makes the same predictions about which neurons are crucial — for if it doesn’t, then the whole idea sinks.

Indeed, having laid out their brand new wiring diagram, Cook et al seem to take a dim view of this line of work. In a not-so-subtle admonishment to 20 years’ worth of work, they conclude their paper with “modelling the functions of the nervous system at the abstracted level of the connectivity network cannot be seriously undertaken if a considerable number of nodes or edges (for example, edges that represent electrical couplings) are missing.” In a scientific paper, that’s about as harsh a burn as you’re ever going to see.

    But, hey, we also got further evidence this year that angry spats about wiring are all a waste of time. We could instead just ignore the individual neurons. The pioneering work of Manuel Zimmer’s lab in 2015 showed we could record tens of neurons at the same time in the nervous system of a freely-crawling C Elegans, discard the neurons by projecting their activity down into a handful of dimensions, and then successfully map the worm’s different types of movement on to different regions of that low-dimensional space.

This year, Brennan and Proekt took the next step in understanding the mind of the worm: make a model of the low-dimensional dynamics of its brain. Sounds dull. But this model does two important things. For one thing, it solves the pesky variability between brains — even in C Elegans, there are big differences between animals in the patterns of activity of the same neuron during the exact same behaviour. So creating a model of the dynamics common to all the neurons means that it applies across all worms, even those not used to make the model. And the second thing: building a generative model means Brennan and Proekt could create new brain dynamics from a given starting point, then see when the model’s activity changes to a new part of the low-dimensional space, and so predict when the corresponding behaviour will happen. Even better: they can do prediction in worms not used to create the model.

Using the Zimmer lab’s data, Brennan and Proekt did just that. Even with just 15 neurons in common between recordings, they were successful: they built a model that captured the joint activity and its changes in just two dimensions; generated from that model neural dynamics and corresponding behaviour that matched the distributions of forward, backward, and backing-up locomotion; and used the model to successfully predict when the worm would transition to moving forwards, based on where the neural dynamics started off — and did so in an entirely new set of worms. To me, this paper is a glimpse of the future of neuroscience: a model of neural activity successfully predicting changes in behaviour in new animals. Is this what “understanding” means to you?
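    The general recipe is simple enough to sketch in code. Here is a toy illustration in Python (emphatically not Brennan and Proekt's actual model, and using made-up data) of the three moves: project population activity into a low-dimensional space, fit dynamics in that space, and roll the fitted model forward from a starting point.

```python
import numpy as np

rng = np.random.default_rng(1)
activity = rng.standard_normal((500, 15))  # hypothetical (time x neurons) recording

# 1. Project the population activity onto its top two principal components.
centred = activity - activity.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
z = centred @ vt[:2].T                     # (time x 2) low-dimensional trajectory

# 2. Fit linear latent dynamics, z[t+1] ~ z[t] @ A, by least squares.
A, *_ = np.linalg.lstsq(z[:-1], z[1:], rcond=None)

# 3. Generate a new trajectory from a chosen starting point. In the real work,
#    regions of the low-dimensional space map onto behaviours, so where the
#    simulated trajectory heads is a prediction of what the worm will do next.
state = z[0]
trajectory = [state]
for _ in range(100):
    state = state @ A
    trajectory.append(state)
```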

    In Other News…

And 2019 brought us so much more. The journal Nature changed its entire design and layout, booting out the torturous 1,500-word-limit “Letters” that made up the bulk of its papers, and instead publishing all papers in its longer Article format — with the aim of improving readability. I’ll let you be the judge of whether that succeeded. Krishna Shenoy and team showed us that spike-sorting is a waste of time for understanding the activity of big groups of neurons (and also showed us the glacial pace of publishing: the preprint was posted December 5th 2017; the paper came out July 7th 2019, without changing a lot). Neuralink finally announced some stuff, including releasing a white paper on the technical design of neural threads and the robot that implants them, apparently written by Elon Musk himself, according to the author byline. Actually, he has form in this area.

    Then there was Ed Yong’s terrifying piece on the scandal over the depression genes that aren’t. An entire edifice of scientific research is built upon the fact that variants of a handful of specific genes (like SLC6A4) alter the risk of depression. Except that they don’t. The links were established in tiny studies using a few hundred people. As soon as you use a big enough sample of people, the link between the gene variants and depression disappears. Which should have been mind-numbingly obvious: there is no way a meaningful link between the variation of a single gene and depression could be detected with a few hundred people. For all of us, it’s yet another lesson that we continue to do science badly wrong. The moral: first work out what you can and cannot detect with the tool you’re using, and only then do the science with that tool.

    Regular readers may have noticed that 2019 also brought something of a hiatus to The Spike, the home of this very piece, with the last 6 months bringing mostly silence. That’s thanks to a combination of the commitments of running a lab, running a conference, life events, and a major project that will be announced next year… (hint: it rhymes with “ook”). But The Spike’s back catalogue is now something I’m actually quite proud of, with much to see and do. So if you fancy keeping your mind purring during the Christmas break, take a dive into The Spike’s A-Z Guide.

    Happy Holidays everyone. And good luck for 2020. We need it.

    Want more? Follow us at The Spike

    Twitter: @markdhumphries

    Appendix — how not to compute a correlation

    That figure again, to save you the scrolling. From Goriounova et al (2018). Each symbol is the average over the neurons in one subject; error bars are one standard deviation. The correlation (r value), regression (black line), and confidence interval of regression (blue shading) all appear to be taken from the symbols — i.e. the average scores.

    Remember: each data point is a patient — it represents the average length of a pyramidal neuron’s dendrites in that patient. The correlation was computed between the IQ score and these averages. That’s bad.

    Because the data-points are averages, they have an error in their estimation. In their plot, each data-point has error bars giving the standard deviation of lengths in that patient. Some of them are really big. That suggests a lot of leeway for the possible “true” average values (even though computing the average is meaningless, as already discussed above).

    At minimum, they should have computed the whole set of possible correlations, by including the full likely range of each patient’s average value. And I’m willing to bet that this set of possible correlations includes many that are indistinguishable from zero.
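    To make that concrete, here is a minimal sketch in Python of that resampling check. All the numbers are invented purely for illustration (the per-patient means, standard deviations, and IQ scores are hypothetical, not the study's data): perturb each patient's average within its error bars, recompute the correlation each time, and look at the spread you get.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: one entry per patient (not the study's actual numbers).
iq = np.array([85.0, 92.0, 100.0, 104.0, 110.0, 118.0, 125.0])
mean_length = np.array([3.1, 3.4, 3.2, 3.8, 3.6, 4.0, 4.2])  # average dendrite length
sd_length = np.array([0.9, 1.2, 0.7, 1.1, 1.4, 0.8, 1.3])    # the error bars

n_draws = 10_000
r_values = np.empty(n_draws)
for i in range(n_draws):
    # Perturb each patient's average within its reported error,
    # then recompute the correlation with IQ.
    perturbed = rng.normal(mean_length, sd_length)
    r_values[i] = np.corrcoef(iq, perturbed)[0, 1]

print(f"correlation ranges from {r_values.min():.2f} to {r_values.max():.2f}")
print(f"fraction of draws with r <= 0: {(r_values <= 0).mean():.2%}")
```

    If a sizeable fraction of those draws land at or below zero, then the single headline r value is doing work the data cannot support.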

    The plot contains a stark clue that correlation is not the answer. Taking a single neuron from a patient means that patient’s data-point has no error bars, has no variation. (Whereas in fact that data-point is useless as it’s one neuron out of a few billion pyramidal neurons in the superficial layers). Which means the possible correlation values will vary less by taking fewer neurons per patient. Anytime you get a more reliable correlation by measuring fewer things, you know something’s gone wrong.

The study makes a valiant effort to come up with a causal mechanism, via modelling work that shows having larger dendrites increases the speed at which spikes are made — and so could let neurons track their inputs better. Which of course assumes that has something to do with intelligence…


    2019: a lightly bamboozled review of the year in neuroscience was originally published in The Spike on Medium, where people are continuing the conversation by highlighting and responding to this story.

    in The Spike on December 19, 2019 10:29 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Our Ten Most Popular Posts Of 2019

By Matthew Warren

    It’s been an eventful year at Research Digest: we’ve said goodbye to old staff members and hello to new ones; we’ve commissioned and published numerous guest posts and features, and sent out dozens of newsletters to our subscribers; and we were even finalists for a national science writing award. And through it all, we’ve been delighted that so many readers continue to turn to us to learn about the latest psychological research. So as we take stock before the Christmas break, here’s a look back at our most popular posts of the year:


    10) There Are Sex Differences In The Trajectory Of Depression Symptoms Through Adolescence, With Implications For Treatment And Prevention

    It’s well known that children experience more negative moods as they reach their teenage years. But a study published earlier this year found that the exact pattern of these changes differs between males and females.


    9) Researchers Identify Sleep As A Key Reason Why Personality Traits Predict Longevity

    Certain personality traits are associated with an increased year-on-year risk of dying — partly because people with particular traits are more or less likely to engage in behaviours like drinking or smoking. Now, it seems that differences in how much sleep people get is also an important part of the relationship between personality and longevity.


    8) Different Kinds Of Loneliness – Having Poor Quality Relationships Is Associated With Greater Distress Than Having Too Few

    We often think of loneliness as a binary construct: people are either lonely, or they’re not. But researchers have begun to disentangle different sorts of loneliness, finding that some kinds may have more detrimental effects on our mental health than others.


    7) First Study To Explore What It’s Like To Live With Avoidant Personality Disorder: “Safe When Alone, Yet Lost In Their Aloneness”

    Through a series of in-depth interviews with people with avoidant personality disorder — the first study of its kind — psychologists have developed a better understanding of what the disorder is like for those affected.


    6) There Are Some Intriguing Differences Between The USA And Japan In How Emotions Influence Health

    Feeling good in an emotional sense can end up benefitting our physical health as well. But what makes people “feel good” varies between different cultures — so, as this study demonstrated, behaviours that increase well-being and physical health in one culture may not have the same effects in another.


    5) Researchers Say Growing Up With A Troubled Or Harsh Father Can Influence Women’s Expectations Of Men, And, In Turn, Their Sexual Behaviour

    Daughters’ later relationships are particularly affected by having had a poor-quality father, this study suggested, rather than simply a father who was absent entirely.


    4) Why Do People With Depression Like Listening To Sad Music?

    It’s not because depressed people deliberately act in ways that maintain their low mood, as some researchers have controversially suggested. Instead, this study found that people with depression often choose to listen to low-energy tunes because they actually boost their feelings of happiness.


    3) Adults Who Played Pokémon Extensively In Childhood Have A Pokémon-Sensitive Region In Their Visual Cortex

    A study that involves Pokémon and brain imaging?! It’ll probably come as no surprise that this one made it into our top three posts of the year.


    2) Researchers Have Investigated “Derailment” (Feeling Disconnected From Your Past Self) As A Cause And Consequence Of Depression

    We usually have a stable sense of self, and feel that our present identity is a continuation of our past one, no matter what changes we may have gone through. But what happens when that thread of continuity is disrupted?


    1)  Study Identifies The Most Effective Mental Strategies That People Use To Get Through Aversive Challenges

    There are lots of ways people might motivate themselves to make it through that strenuous run or pre-exam cram session — but they don’t all work. Our most popular post of 2019 looked at research into the strategies that are most strongly correlated with success.

    That’s it from us for this year! Thanks for reading, and we’ll be back in January to bring you more psychology, digested.

    in The British Psychological Society - Research Digest on December 19, 2019 09:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    “It made me aware falls can happen”: responses of older adults to tailored audio-visual fall prevention messages

    Falls in older people

Falls risk increases sharply with older age, and falls among older people are a significant global problem. However, many older people are unaware of, or underestimate, their risk of falling. This may explain why many older people are unwilling to take up existing falls-prevention strategies. Because some falls can be prevented, increased population-based efforts to influence older people’s falls-prevention behavior are warranted. To this end, our team developed three prototype TV-commercial-like audio-visual falls-prevention messages that aimed to promote falls prevention in the community in the context of a positive view of aging. It was envisaged that these messages could potentially be used as one component of a future community-wide falls-prevention campaign that could operate through television, social media, and health systems. To obtain a broad community perspective on these audio-visual messages, we organized a community World Café forum.

    The World Café approach

    A World Café is a facilitated informal series of conversations around a set of predetermined issues defined by those hosting the event. In a sense, a World Café forum follows the remit of conventional focus group processes, but the less formal café style atmosphere is thought to better encourage people to have conversations about things that deeply matter to them. The forum was conducted using the World Café’s seven principles: (i) setting context; (ii) creating a hospitable space; (iii) exploring the questions; (iv) encouraging everyone’s contribution; (v) connecting diverse perspectives; (vi) listening together for patterns; and (vii) sharing collective discoveries. Our team has used a World Café forum approach previously and because participants felt the World Café was seen as empowering, positive, and promoting respect for older people, we decided to use it again for this research study.

     

    Our new study, recently published in BMC Geriatrics, involved 38 community-dwelling older people who signed up for the forum and watched the three audio-visual messages together on large projector screens. The three prototype messages intended to explicitly convey the following messages: i) falls can and will happen to anyone, and ii) prepare yourself for preventing a fall by doing activities that you enjoy. The screening of each message was followed by a 20-minute conversation round. To stimulate conversation between participants, semi-structured, open-ended questions and prompts were used. Table facilitators collected the participants’ responses on large paper sheets. A random selection of the participants’ positive and negative responses was presented ‘live’ on the projector screens so participants could comment on and discuss what was said at other tables in the forum. After about two hours of discussion and a break during which refreshments were served, the main facilitator summarized the key responses of each conversation round, which were presented to the group on large summary paper sheets. This gave the forum participants the opportunity to provide any further input and feedback and also served as a form of member checking of the collective perspective before the forum concluded.

    Findings of the World Café forum

Participants’ feedback on the three prototype audio-visual falls-prevention messages suggested that the messages had increased their falls-prevention capability and positively influenced their motivation to take falls-prevention action. Despite these positive findings, the group consensus was that a more inspirational call to action was needed. A wider variety of revised and tailored audio-visual messages, as one component of a community-wide falls-prevention campaign, could be considered in an effort to persuade older people to take decisive action about their falls risk. Participants also felt that falls-prevention information should be promoted in community venues rather than through online resources.

    The post “It made me aware falls can happen”: responses of older adults to tailored audio-visual fall prevention messages appeared first on BMC Series blog.

    in BMC Series blog on December 19, 2019 09:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Multi-panel figures: Using GIMP to combine individual images for use in PLOS articles

    Note:    This post was written by Eric Cain, a member of PLOS’ Production Team. This is the second installment in a “Format for Success” series from this team. The first post was about  Formatting figures

    Communicating scientific findings often requires images, charts, graphs, and more to support results and conclusions. When multiple visualizations all relate to one core aspect, it may be useful to include two or more panels to demonstrate this relationship and further distinguish your findings.

    To help you submit a figure image that includes multiple panels and strengthens the presentation of your work, consider these steps to combine individual parts and still meet PLOS requirements, helping to get you on your way to a published, visually-appealing journal article.

    Decide: Evaluate the implications of combining related pieces into a greater whole

    If you have two or more images of individual parts demonstrating a larger concept, you may want to include them as a single figure image, rather than submitting the images as two or more separate figures, each with their own titles and captions.

    Align: Choose a presentation that conveys the desired arrangement

You first want to decide the alignment of the panels in the figure image. We recommend you consider the size and shape of individual panels and what type of relationship between them you wish to demonstrate, then determine the arrangement. Common arrangements include panels placed side by side, stacked vertically, or laid out in a grid.

    Combine: Import and edit each of your panels using an image editing software

To create a multi-panel figure, use a presentation program such as Microsoft PowerPoint, OpenOffice Impress, or Keynote for Mac, or an image editor such as GIMP. Here, let’s consider how GIMP can combine panels into a single figure:

    1. Navigate to File > Open as Layers…
    2. Select each of the files containing your desired panels and click Open.
    3. Each file will open as a separate layer, one on top of the next. From the Image menu, select Canvas size… in order to resize the background layer to expand the working space for all the layers in your new figure once you have rearranged them.
    4. Choose an approximate size for the canvas, considering your desired alignment. Use the Width and Height selection boxes to set a new size for your background canvas and then click Resize.
5. Select the first layer you wish to arrange and use the Move Tool to reposition the layer in the area of your background canvas. Click on the layer and drag it into its intended location. Continue to arrange each layer as needed.
6. Next, use the Text Tool to add labels (e.g. A, B, C or 1, 2, 3). Click in the area intended for text, drag diagonally to reserve space for a text box, release, and then click inside the new text box to type in labels. Letters are recommended, as they easily distinguish between panels and avoid confusion with the figure numbers used elsewhere.
    7. When all layers of images and text are complete, then select Crop Tool to cut out unnecessary space from your figure edges. Click in a corner at the desired edge of your figure, hold and drag the cursor to the opposite corner of the desired edge of your figure, then release. Hit Enter to crop to the selected surface area.
    8. Navigate to Image > Flatten Image to combine all layers into a single, flat image.
    9. Scale your image to fit page size and resolution requirements. Select Image > Scale Image… and use the Width and Height image size boxes to set a print size, while also considering a resolution between 300 and 600 DPI. Note that as resolution increases, print size decreases unless the image is scaled larger, and vice versa. When ready, click Scale.
10. Lastly, export your new figure image to your computer by choosing File > Export As… and adding in the corresponding name for your new file (e.g. Fig1.tif or Fig2.tif). Click Export. When prompted, choose LZW as the compression type and select Export.

    Congratulations! You now have combined panels into a single image.
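    If you would rather script the combine, flatten, and export steps than click through GIMP’s menus, the same workflow can be sketched in Python with the Pillow imaging library. This is a minimal sketch, not part of the official PLOS guide: the filenames and the side-by-side layout are hypothetical, and you would still need to add panel labels separately.

```python
from PIL import Image

# Hypothetical input files, one per panel.
panels = [Image.open(name) for name in ("panelA.tif", "panelB.tif")]

# Size a blank canvas to hold the panels side by side (steps 3-5 above).
width = sum(p.width for p in panels)
height = max(p.height for p in panels)
figure = Image.new("RGB", (width, height), "white")

# Paste each panel at its intended position.
x = 0
for panel in panels:
    figure.paste(panel, (x, 0))
    x += panel.width

# Export as an LZW-compressed TIFF at 300 DPI (steps 9-10 above).
figure.save("Fig1.tif", compression="tiff_lzw", dpi=(300, 300))
```

    As a rule of thumb for the resolution step: print size in inches is pixel dimensions divided by DPI, so an 1800-pixel-wide figure prints 6 inches wide at 300 DPI but only 3 inches wide at 600 DPI.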

A few considerations while combining panels: If your figures have numerous pictures, charts, or small text, they will render best at a resolution of 600 DPI. It is preferable to scale each panel to a uniform resolution before combining, so that some panels do not appear blurry while others are sharp. You may also export your new figure as another file type and use PACE to convert your files to TIF. Finally, if you do not wish to combine multiple panels into a single image file, then you must break them apart into separate figures. If you proceed in this direction, renumber all figures and in-text citations accordingly so that new figure numbers and labels correspond correctly. The PLOS publishing platform and community databases such as PMC only allow for one figure file per designated figure in the article text.

Software like PowerPoint or GIMP may be new to authors who have not previously compiled multi-panel figures. The steps above should help you get started, and there are plenty of GIMP tutorials available if you need to make more specialised modifications.

    Combining multiple, related panels into a single figure image improves the visual organization of scientific concepts that cannot fit into a single panel. This post offers insight on figure presentation options and includes a simple guide to help you to create the most effective figure image possible for your PLOS journal article.

    in The Official PLOS Blog on December 18, 2019 04:53 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Thinking About Past Generations Could Help Us Tackle Climate Change


    By Emily Reynolds

    Rhetoric around climate change often calls on us to think of future generations: if we don’t suffer the effects, then our children and our children’s children will. For some, this sense of obligation could be motivating. But for others, the distant time frame may be a barrier to truly grappling with the issue.

    Now, a new study in Personality and Social Psychology Bulletin suggests one method to get people thinking about their duty to future generations is to think about the past.

    In their new paper, Hanne Watkins from the University of Massachusetts and Geoffrey Goodwin from the University of Pennsylvania suggest that reflecting on the actions of previous generations could cause a greater sense of “intergenerational reciprocity”: thinking about past sacrifices, in other words, could make us more likely to make sacrifices ourselves. At the moment, they argue, key decision makers are faced with a dilemma: their own interests conflict with the interests of future generations. Working out how to increase this intergenerational reciprocity, therefore, could be an important way to influence positive policymaking.

To understand how our view of the past shapes our actions towards the future, the researchers first asked 200 participants to respond to a writing prompt, which encouraged them to reflect on either the sacrifices made by previous generations (“which sacrifices made by members of past generations are most important in allowing you to enjoy your current way of life?”) or simply on their fashion choices.

    Next, participants were asked to rate how grateful they felt towards past generations on a scale of one to seven, as well as rating how obligated they felt towards future generations. Finally, they rated the importance of twelve social and political issues, including environmental pollution, sustainability and global warming.

    As expected, gratitude towards past generations was significantly higher in the group asked to reflect on sacrifice. But so too was obligation to future generations, suggesting that reflection on the past really did have an impact on how people thought about what’s required to make change. However, the two groups didn’t show any differences in the perceived importance of environmental issues.

    A second study explored these findings further, asking some participants to reflect on the lack of sacrifices made by past generations. There was also an additional measure at the end of the study, with participants asked if they would be willing to give money or pay more tax to help with environmental issues. In this case, again, reflecting on sacrifice increased gratitude, though there was no significant effect on how willing someone was to give up money for the cause.

And in a final study, some participants were asked to reflect on specific sacrifices — those made during World War II — rather than coming up with their own. Again, participants in the sacrifices condition felt more gratitude towards past generations, and also reported that the current generation was more “unworthy” and had an easier life. But in this case reflection on sacrifice did not increase obligation towards future generations in any significant sense.

    So is thinking about the sacrifices of past generations a sufficient strategy when it comes to encouraging pro-environmental behaviour? Frustratingly, it’s rather difficult to say. While the results of the studies were mixed, overall the team found it did have an impact on our sense of duty towards our descendants. But even if this strategy does increase people’s sense of obligation, this alone may not be enough to change behaviour, as results on donating money seem to indicate.

The question of whose behaviour needs to be changed is also important — although making pro-environmental choices on a day-to-day basis may be a positive foundation for an ethical life, it is key policymakers and influential people who really need to be convinced. For these figures, many of whom have vested interests in processes and institutions that are decidedly not pro-environmental, shifting opinion may be a little harder.

    Reflecting on Sacrifices Made by Past Generations Increases a Sense of Obligation Towards Future Generations

    Emily Reynolds (@rey_z) is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on December 18, 2019 09:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Update on mobile-friendly abstract page edits

    We wanted to update the arXiv community that the new mobile-friendly edits we recently blogged about are now deployed. If you visit the abstracts page on your phone you should see the updates, including improved legibility on small screens and a new download PDF button at the top of the page. Here is a quick preview of the new mobile-friendly layout:

    mobile-friendly abstract page layout

    Thanks again to our active user testing group who took part in our recent survey to assess the proposed mobile-friendly changes. The response level was phenomenal and we highly value your time and your insights. We have catalogued all additional feedback that was outside the scope of this change and will use it to inform future work.

Want to join our next user testing survey? Join the arXiv usability email list where opportunities to participate in polls and usability testing are announced. To join (subscribe to) the list:

    • Send an email message to arxiv-usability-testing-l-request@cornell.edu
    • For the Subject of your message, type the single word: join
    • Leave the body of the message blank.

    Thank you for helping us to improve arXiv!

    in arXiv.org blog on December 17, 2019 09:35 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Bah Humbug

    Edinburgh psychologists announce in Nature Communications genes for being rich. A Christmas Carol.

    in For Better Science on December 17, 2019 09:05 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    When Thinking About Your Personality, Your Friends’ Brain Activity Is Surprisingly Similar To Your Own


    By Emma Young

    How well do you know your best friend? New research led by Robert Chavez at the University of Oregon suggests that scans of both your brains might provide the answer. The study, published in the Journal of Personality and Social Psychology: Attitudes and Social Cognition, reveals that the brain activity patterns of people asked to think about what a mutual friend is like can be remarkably similar to those observed in that friend when they think about themselves.

    For the round-robin study, the researchers recruited 11 students aged 24-29 who were all friends and spent a lot of time together. Each of the students first rated themselves, as well as each of their friends, on a variety of personality measures, including Big Five personality traits, and self-esteem.

    Next came the brain-scanning. While their brains were imaged using fMRI, the participants were given tasks that would later allow the researchers to identify which region of their medial prefrontal cortex (mPFC) was most active when they thought about themselves. Then, while still in the scanner, the participants completed the main task, which was similar to the initial one: they had to indicate whether 48 trait adjectives (including sad, lonely, cold, lazy, trustworthy, fashionable, helpful, punctual and nice) applied to themselves and also to each of their friends.

Using all this data, Chavez and his co-author Dylan Wagner, at Ohio State University, identified, for each participant, a pattern of mPFC activity that occurred when they rated themselves on these traits. They also averaged out the friends’ mPFC patterns when they thought about a specific individual. This gave them one aggregated pattern that they could compare to the individual’s own.

    The pair found distinct similarities. When the participants were thinking about one particular friend in the group — let’s call him ‘Person A’ — their aggregated activity pattern was closer to that seen in ‘Person A’ when he thought about himself than to anybody else’s “self” pattern.

    The similarity between an individual’s activity pattern and that of their friends related to the initial judgement ratings: the more closely the friends’ initial ratings matched an individual’s self-judgements, the more similar the self/friends brain activity patterns. This may be because participants with particularly close self/friend judgement ratings are better at conveying their personalities to others, the researchers suggest.

    The study does have some limitations. Notably, the sample size of 11 is small, although the researchers are at pains in the paper to explain why this was the case. Partly it related to task timing — because of the round robin design, every extra individual included in the study would have extended the time taken to test everybody else. Also, the participants were all young students in a tight-knit social group. The results may or may not extend to other age groups and types of relationships — such as groups of co-workers. It’s also worth bearing in mind that thinking about the personalities of similar friends can influence a person’s judgments of their own personality, which may perhaps have affected the results.

Still, showing that there are these self/friend similarities in mPFC activity does point to all kinds of potentially interesting future studies. For example, one way to explore discrepancies between how a person with anxiety or depression sees themselves versus how others see them could be to look at mPFC activity. Overall, as the researchers note, the results “point to a neural mechanism underlying accuracy in interpersonal perception.”

    The neural representation of self is recapitulated in the brains of friends: A round-robin fMRI study

    Emma Young (@EmmaELYoung) is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on December 17, 2019 09:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Timing Is Crucial For Creating Accurate Police Sketches From Eyewitness Descriptions


    By Emma Young

    A witness to a crime has to describe the offender’s face in as much detail as they can before they work with a police expert to create a visual likeness — a “facial composite”, sometimes called a photo-fit, or e-fit. But the way this is typically handled in police stations could be reducing the accuracy of these images, according to a new paper published in the Journal of Experimental Psychology: Applied.

There have been concerns that the process of describing facial features might create a so-called “verbal overshadowing” effect that interferes with visual memories of the offender. Recent work had suggested that waiting half an hour before starting on the composite should allow this predicted overshadowing to fade away, and so make for a better composite. However, the new research, led by Charity Brown at the University of Leeds, has found that in more real-world situations, a delay actually makes things worse.

    In the first of three studies, the team split 96 participants who reported being totally unfamiliar with the UK TV soap EastEnders into eight different experimental conditions. Each participant watched a brief video clip that featured two of a total of 12 actors from the programme. They were asked to focus either on the content of the conversation (to simulate the kind of incidental facial information that an eyewitness might gather when they don’t realise that they are actually witnessing a crime) or to focus on the faces.

     Either four to six hours or two days later, these participants were asked to verbally describe one of these faces — to report on the shape and colour of the eyes, nose, forehead, and so on. Then there was either a delay of half an hour, or no delay, before each participant worked with Laura Nelson at Lancashire Constabulary Headquarters on a facial composite. (Nelson had no idea which particular actors each participant had seen).

    Another group who were familiar with EastEnders were given the composite images to identify. A separate group also rated how similar each composite was to photographs of the actors.

The results were clear. There was only one experimental condition that produced worse likenesses than the others: when the descriptions were given two days after viewing the clip and there was a 30-minute delay before work started on the composite. Unfortunately, this is also the condition closest to real-world police investigations, the team notes. Facial descriptions are often not collected until a few days (at least) after a crime, and it’s not uncommon for witnesses to be offered a break before starting on the composite.

    Two days after seeing a face, witnesses’ facial descriptions tend to be less full but more accurate than descriptions given four to six hours afterwards (perhaps because they’ve had more time to process which facial features were most striking, the team suggests). But a half-hour delay immediately after giving the verbal descriptions seems to then impair their access to details of their recalled descriptions that would otherwise contribute to a better facial likeness, the team writes.

    This effect held whether Nelson used the “holistic” facial likeness approach, common in the UK, in which an initial face is tweaked until it fits the witness’s memory of the offender, or the system in which a facial composite is built up from selected constituent features. (Whichever method is used, verbal descriptions are always gathered first.)

    “The results have real-world but counterintuitive implications for witnesses who construct a face 1 to 2 days after a crime,” the researchers conclude. “After having recalled a face to a practitioner, an appreciable delay (here, 30 min) should be avoided before starting face construction”.

    Reevaluating the Role of Verbalization of Faces for Composite Production: Descriptions of Offenders Matter!

    Emma Young (@EmmaELYoung) is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on December 16, 2019 09:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Extra: Susan Schneider, author of "Artificial You"

This episode of Books and Ideas is an interview with Susan Schneider, author of a fascinating new book called Artificial You: AI and the Future of Your Mind. Schneider’s book goes beyond the question of whether AI might become conscious to issues that might affect us on a more personal level. I am cross-posting this in the feed for Brain Science because there is an obvious overlap with the issue of consciousness, which we often discuss on Brain Science.

    in Brain Science with Ginger Campbell, MD: Neuroscience for Everyone on December 15, 2019 06:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    LIBER and Communia Launch Joint Guidelines on Text and Data Mining | LIBER

    LIBER and Communia have released detailed guidelines on the implementation of the Digital Single Market Directive. LIBER has specifically worked to develop the guidance related to text and data mining, as covered in Articles 3 and 4 of the Directive.

    in Open Access Tracking Project: news on December 13, 2019 04:27 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Blue Spaces And Whale Wisdom: The Week’s Best Psychology Links

    Our weekly round-up of the best psychology coverage from elsewhere on the web

    Thinking of your sadness as a person — à la the Pixar movie Inside Out — can make you feel less sad. That’s according to a recent study which highlights the benefits of putting some distance between yourself and your emotions, reports Elle Hunt at The Guardian — though the strategy can backfire when it comes to positive emotions like happiness.


    We’ve previously written about the psychological benefits of spending time in green spaces — but what about “blue” spaces? At Undark, Jenny Roe looks at the — admittedly limited — research into the potential for water bodies to also boost our well-being.


    “Many researchers say they now see social priming not so much as a way to sway people’s unconscious behaviour, but as an object lesson in how shaky statistical methods fooled scientists into publishing irreproducible results.” At Nature, Tom Chivers takes stock of the embattled field of social priming, and asks where it can go from here.


    Our grandparents pass on all kinds of wisdom and knowledge to us — and in that respect, killer whales may not be that different.  Researchers have found that killer whales have better survival rates when their grandmas are around, reports Eva Frederick at Science, probably because the older whales have superior knowledge about where to forage for food.


    As the days get colder and darker, many of us long for a lie-in. So why don’t we just change our working hours? Research suggests that our sleep needs change in winter, writes Laurie Clarke at Wired, leaving our body clocks out of sync with the demands of school and work.


    The run-up to Christmas is apparently also peak break-up season — but how do you know whether it’s time to call it quits on your relationship? Veronica Lamarche explores the psychology of breaking up at The Conversation.


    The way that psychologists choose to test hypotheses and analyse data can profoundly affect their findings, a fact that can go some way to explaining the field’s reproducibility crisis. This has been clearly shown in a new study, in which 13 different research teams were given the same five hypotheses to test in whatever way they wanted. The results were equivocal, with evidence both for and against the hypotheses, reports Christie Aschwanden at Wired, demonstrating the pitfalls of relying on just a single study for evidence. 

    Compiled by Matthew Warren (@MattbWarren), Editor of BPS Research Digest

    in The British Psychological Society - Research Digest on December 13, 2019 10:45 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Mobile-friendly survey results

    arXiv recently held a survey to assess proposed mobile-friendly changes to the abstract page. First of all, we want to extend a huge thank you to all of the survey participants! Participation levels were very high and we really appreciate the feedback each of you provided. We value your time and your insights.

In our survey we asked participants to review the proposed abstract page on their mobile device and then rate its usability against the current live page:

arXiv survey question comparing the existing abstract page to a mobile-friendly version

The results of the survey are in: users overwhelmingly found the proposed abstract page style improved their experience on mobile devices:

    • Yes, I prefer the new style: 95.97%
    • No, I prefer the old style: 4.03%

    The new mobile-only PDF download button and greater text legibility were the two most frequently cited improvements, followed by improved content flow.

    We will push these mobile-friendly edits to the live site next week, so if you access arXiv on your phone you’ll notice some changes on the abstract page. We are also cataloging the additional insights and suggestions that users provided and future updates will continue to be informed by your feedback.

    Want to join our next user testing survey? Join the arXiv usability email list where opportunities to participate in polls and usability testing are announced.

To join (subscribe to) the list:

    • Send an email message to arxiv-usability-testing-l-request@cornell.edu
    • For the Subject of your message, type the single word: join
    • Leave the body of the message blank.

    Thank you for helping us to improve arXiv!

    in arXiv.org blog on December 12, 2019 08:34 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Your life’s purpose. Why finding your passion is essential to maintaining brain health.


    I’ve been writing this brain health blog since 2013, and it has changed my life and my career in extraordinary ways.

My original purpose for this blog was to provide impeccably researched, evidence-based stories that are told in a simple, fun and compelling way. And it has certainly taken me on an extraordinary journey.

    Purpose, defined as the tendency to derive meaning from life’s experiences and to possess a sense of intentionality and goal-directedness that guides behaviour, can be quantified!

    With purpose and meaning comes positive emotions — love, compassion, and appreciation — which counteract stress and support a healthy brain throughout life.

Blue Zones residents are members of faith communities and find meaning and purpose through spirituality. Living a meaningful life seems an unlikely addition to a brain blog, but ‘purpose in life’ is a concept in neuroscience that links to robust brain and mind health.

    Purpose in life reduces risk of Alzheimer’s disease and cognitive impairment

Drs David Bennett and Patricia Boyle of the Rush Medical Centre in Chicago published this finding in a paper in the Archives of General Psychiatry in 2010.

The project studied more than 900 community-dwelling older people without dementia.

    All participants underwent baseline evaluations of their purpose in life. And they were followed up over seven years to see if they went on to develop cognitive impairment or symptoms of Alzheimer’s disease.

    The study defined ‘purpose in life’ as: the psychological tendency to derive meaning from life’s experiences and to possess a sense of intentionality and goal-directedness that guides behaviour.

To measure ‘purpose in life’, the team asked participants to rate, from one to five, their level of agreement with each of the following statements:

    1. I feel good when I think of what I have done in the past and what I hope to do in the future.
    2. I live life one day at a time and do not really think about the future.
    3. I tend to focus on the present because the future nearly always brings me problems.
    4. I have a sense of direction and purpose in life.
    5. My daily activities often seem trivial and unimportant to me.
    6. I used to set goals for myself, but that now seems like a waste of time.
7. I enjoy making plans for the future and working to make them a reality.
    8. I am an active person in carrying out the plans I set for myself.
    9. Some people wander aimlessly through life, but I am not one of them.
    10. I sometimes feel as if I have done all there is to do in life.

    Scoring for the negatively worded items was flipped (e.g. Qs 5, 6 & 10) and item scores were averaged to give a total purpose in life score for each person, with higher scores indicating greater purpose in life.
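    As a concrete illustration, here is a minimal sketch of that scoring scheme in Python, using hypothetical ratings. (The post lists Qs 5, 6 and 10 as examples of negatively worded items; judging from the wording above, items 2 and 3 read as negative too, so the sketch flips those as well.)

```python
# Negatively worded items whose scores are flipped before averaging.
REVERSED_ITEMS = {2, 3, 5, 6, 10}

def purpose_score(ratings):
    """ratings: dict mapping item number (1-10) to an agreement rating (1-5)."""
    total = 0
    for item, rating in ratings.items():
        # Flipping on a 1-5 scale: 1 becomes 5, 2 becomes 4, and so on.
        total += (6 - rating) if item in REVERSED_ITEMS else rating
    return total / len(ratings)

# Hypothetical respondent: agrees with the positive items,
# disagrees with the negative ones.
ratings = {1: 5, 2: 1, 3: 2, 4: 5, 5: 1, 6: 2, 7: 4, 8: 5, 9: 5, 10: 1}
print(purpose_score(ratings))  # 4.7, close to the maximum purpose score of 5
```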

    All of the scores were adjusted (a statistical technique that takes into account other factors and ‘levels the playing field’) for depressive symptoms, neuroticism, social networks, and chronic medical conditions.

    Results showed

    • In the 7 years of the study, 155 of 951 people (16.3%) developed Alzheimer’s disease. Statistical analysis showed that greater purpose in life was associated with a substantially reduced risk of Alzheimer’s disease (hazard ratio, 0.48; 95% confidence interval, 0.33-0.69; P<0.001).
    • A person with a high purpose in life score was approximately 2.4 times more likely to remain free of AD than was a person with a low purpose in life score.
• A high purpose in life score was also linked to less ‘mild cognitive impairment’. Mild cognitive impairment is a long preclinical phase through which people may transition before they show sufficient symptoms to be diagnosed with Alzheimer’s disease.
• A high purpose in life score was also linked to a slower rate of cognitive decline in old age. Purpose in life was most strongly related to slower decline in semantic memory, followed by episodic memory, then perceptual speed and working memory.

Purpose in life had previously been linked to positive health outcomes including:

    • better mental health
    • less depression
    • happiness
    • satisfaction
    • personal growth, self-acceptance
    • better sleep
    • longevity

    What is the biological basis of purpose in life?

    How does purpose in life protect against cognitive decline?

    This is a hard question to answer.

    The researchers state,

    The finding that purpose in life is related to longevity in older persons suggests that aspects of human flourishing—particularly the tendency to derive meaning from life’s experiences and possess a sense of intentionality and goal-directedness that guides behavior—contribute to successful aging.

It is likely people who experience greater purpose in life are less stressed and experience more positive emotions. For example, lack of purpose in life is associated with high levels of the stress hormone cortisol, markers of inflammation, low high-density lipoprotein cholesterol levels (the ‘good’ cholesterol), and abdominal fat – all factors that are associated with poor general health.

A subsequent study, published in 2012 in the Archives of General Psychiatry, reported that greater purpose in life may help stave off the harmful effects of the plaques and tangles associated with Alzheimer’s disease. Patricia Boyle said,

    Our study showed that people who reported greater purpose in life exhibited better cognition than those with less purpose in life even as plaques and tangles accumulated in their brains…

    These findings suggest that purpose in life protects against the harmful effects of plaques and tangles on memory and other thinking abilities. This is encouraging and suggests that engaging in meaningful and purposeful activities promotes cognitive health in old age.

A 2019 Frontiers in Psychology review titled “Something to Live for”: Experiences, Resources, and Personal Strengths in Late Adulthood explores ‘disengagement theory’. The theory suggests a view of old age as a time of life when people step back from various commitments and social roles. But the review’s findings highlight how strongly older adults desire to remain active participants in society, through creating opportunities for social connectedness, contribution, and belongingness.

    One of the elders interviewed for the 2019 review stated:

    To be part of a bigger group enables you to deal better with things. This is what gives meaning to our lives… We are not loners that live merely to survive; we live because we are part of society. This is what holds us, this is what I think gives life purpose and meaning…

    And another said,

    I see that there are times when I’m not focused on a specific target, and then I waste my time not doing things that are meaningful for me. And there are things that are important to me, things that I really want to do, but due to a lack of thinking ahead or planning, I postpone them or don’t do them properly… It is important for me not to waste time, not only because I think that there is a limited time to each person, but because everyone has missions to fulfill in life, and it’s a pity to postpone them. It’s not just that we are born and then die.

    How do you find your life’s purpose?

By lovely coincidence, another wellness blogger, Mark Sisson of Mark’s Daily Apple, was also writing about purpose and longevity recently.

    Because he says it so well, here is his take on how to find your purpose, and I couldn’t agree more …

…do the list making, the rational weighing, the free-form brainstorming that experts suggest. Reflect on your passions, your priorities, your values, your talents and temperament. Consider where all of these can intersect with the needs you see in the circles or society around you. Talk to friends. Take a stab at writing a personal mission statement if you’re so inclined. Mull on the question while you’re washing dishes. Fill your head with the possibilities, the pros and drawbacks, the complexities and ambiguities. But then move out of cerebral mode entirely, get out of your own way, and hand the question over to your intuitive self.

    Personally, I find there’s nothing more conducive to intuitive thinking than solo time outdoors… Think the question once – and only once – as you head out “into the wild” for your mini retreat. Then forget about it for the day. Just be and do and watch and smell and head home when you’re good and ready….

    One day you’ll leave with your answer. Maybe it will come to you like a vision as you round the corner of a trail one day. Maybe it will settle in quietly, almost imperceptibly until you finally notice it’s there with you. Either way, you’ll have let your answer come forth from hours of, call it, Primal meditation. Not a bad source to tap into when you’re seeking purpose – and time away worth the health benefits all on its own.

As I wrote in my book, it’s useful to ask yourself: what’s your north star? Your ‘ikigai’? Your ‘plan de vida’? There are possibly many clever strategies to find the meaning of your life — somewhere in the nexus of passion, skillset, employment opportunity, education and service to others. The psychologist William James said in 1920,

    The deepest principle in human nature is the craving to be appreciated.

    Recently I’ve come across a simpler way.

Over the years, I’ve taken to the ‘science careers advice’ stage with Paul Baldock, a bone biologist at Sydney’s Garvan Institute of Medical Research. We were called on to share our wisdom, our purpose, and what we’ve learned on our career paths in science. Baldock has developed a novel formula for every decision he makes in the research lab, career, and life. He simply asks,

    Is it awesome? Does it help?


    This blog post was updated in December 2019.

    The post Your life’s purpose. Why finding your passion is essential to maintaining brain health. appeared first on Your Brain Health.

    in Yourbrainhealth on December 12, 2019 12:00 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    A professor’s mission: Translating his university’s research into societal impact

    As a champion of open scholarship, Prof. Nicol Keith is building bridges between academia and the rest of the world

    in Elsevier Connect on December 12, 2019 10:59 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The neurobiology of the afternoon nap


    One Saturday afternoon in June 2013, I found myself lying flat on my back in the middle of a shop. A robotic bed hummed and gently jiggled underneath me while I watched a presentation on a strategically placed screen just to the left of where I lay.

    After 3 or 4 minutes of humming and jiggling, the computer spat out a report listing a range of mattresses that suited my individual sleep needs — $5839 for the best match!

    Marketing ploy? Absolutely! We haven’t handed over any money yet. But I was completely won over by the Sleep 101 presentation that was shown during my ‘mattress fitting’. They had the science of sleep pretty well spot on when they stated,

    Sleep is essential to your body’s overall wellness, both physically and emotionally

    But you know that, right? I’m not going to dwell on the benefits of a good night’s sleep in this blog post, because we’re all very well aware of how terrible we feel without adequate sleep.

    Instead, I’m going to focus on my other favourite sleep issue, one that I’m a passionate devotee of — the afternoon nap!

    Professor Leon Lack, sleep scientist at Flinders University in Adelaide says,

    A brief nap can not only reduce sleepiness but also improves cognitive functioning and psychomotor performance (the brain telling the body to move). A few minutes of shut-eye also considerably enhance short-term memory and mood.

    A quick backgrounder in sleep

    Sleep is divided into two major phases of brain activity, named after the accompanying eye movements:

    • rapid-eye-movement (REM) sleep
    • non–rapid-eye-movement (NREM) sleep.

    When you first fall asleep you experience NREM sleep, and then 60 to 90 minutes later, REM sleep kicks in. During the course of a normal night, a healthy adult will experience 4 to 6 consecutive sleep cycles of REM and NREM.

    During NREM sleep, your body is able to move, but your eyes don’t; your breathing and heart rate slow and your blood pressure falls. Blood flow to the brain decreases, and electroencephalograms (EEGs – recordings of brain activity) show slowing of brain activity.

    When you cycle into REM sleep, your body becomes immobile and your eyes move about rapidly. Your blood pressure, heart rate, and breathing rate increase, and blood flow to the brain increases (and my gentlemen readers can attest to blood flow into other areas of the anatomy too). EEG activity also increases and you begin to dream.

    You also dream just as you enter NREM sleep.

    NREM sleep is essential for learning and memory

    Neuroscientists have been gathering evidence for some time that rats need NREM sleep to learn. Patterns of electrical activity recorded from brain cells in the hippocampus (the brain structure involved in learning and memory) while rats explored a maze for the first time were replayed in the hippocampus during post-learning sleep.

    Scientists think that maze memories become consolidated during NREM sleep. Rats that were deprived of post-learning sleep were worse at finding their way through the maze compared to rats that slept.

    This experiment has been repeated in humans. People were invited to spend a day in the lab with Harvard sleep scientist Professor Robert Stickgold. They were trained to navigate their way through a virtual maze at around lunchtime and then tucked up for a siesta immediately after. Neuroscientists monitored their brain waves via EEG and woke them if they started to fall into REM sleep. A second group of maze navigators were left to sit quietly but not nap.

    The people who napped performed much better than the non-nappers when they were retested at navigating the maze.

    Professor Robert Stickgold says,

    Sleep enhances memories. It makes them stronger and more effective.

    My ongoing afternoon siesta research has taught me that the key to making the most of the afternoon nap is to keep it short. When I feel the mid-afternoon slump coming, I give in to it! I set my iPhone timer for 30 minutes so I don’t fall into a deep sleep (and I assume, without running my own EEG, that I’m avoiding REM sleep). I never wake feeling groggy, and it doesn’t interfere with my sleep that night.
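
    For the technically inclined, here is a minimal Python sketch of that ‘strategic nap’ rule of thumb. It hardwires the simplified timings described above (NREM first, REM arriving 60 to 90 minutes after sleep onset) and is purely illustrative, not a clinical tool.

        # Toy "strategic nap" checker, using the simplified timings from this post:
        # NREM begins at sleep onset; REM arrives roughly 60-90 minutes in.
        # Purely illustrative -- not a clinical tool.

        def nap_advice(minutes: int) -> str:
            """Give a rough verdict on a planned nap length."""
            if minutes <= 30:
                return "strategic nap: likely light NREM only, low grogginess risk"
            if minutes < 60:
                return "risk of deeper NREM sleep -- you may wake groggy"
            return "likely a full cycle including REM; expect sleep inertia"

        for planned in (20, 30, 45, 90):
            print(f"{planned}-minute nap -> {nap_advice(planned)}")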

    Boost your memory and mood by taking a short afternoon nap

    As it happens, in 2015 I was invited to give a TEDx talk, and I chose to discuss my love for AND the evidence in favour of the strategic afternoon nap.

    Is napping a friend or foe?

    In the blog above I describe the cognitive benefits of the afternoon nap, or, as I like to call it, the ‘strategic nap’. Naps facilitate executive functioning, memory formation, subsequent learning, and emotional regulation. However, there are studies linking frequent napping with negative outcomes, especially in older people. These are summarised in a 2017 Sleep Medicine Review.

    The review states,

    In spite of these reported benefits of naps, frequent napping has also been associated with numerous negative outcomes (eg, cognitive decline, hypertension, diabetes), particularly in older populations.

    Is there a paradox?

    One reason for the apparent bi-directional effects of napping may be the purpose of the nap. As the reviewers point out,

    the discrepancy in findings may exist because chronic napping (ie, frequent napping over the course of many months or years) could be distinct from acute napping (ie, a single nap in a well-controlled setting).

    As I mentioned, I like to refer to ‘strategic napping’: napping for a short period of time for specific reasons, not napping to fill in time, or due to boredom, recovery or poor health.

    Should afternoon naps be prescribed?

    In summary, the reviewers say,

    In healthy, young individuals, a mid-day nap is beneficial. A bout of mid-day sleep minimizes sleepiness while enhancing executive functioning. Naps also facilitate memory consolidation, subsequent learning, and emotional processing, while providing additional somatic benefits.

    In young, healthy populations who are in need of emotional or cognitive intervention, napping could be prescribed.

    In older populations, excessive napping has been linked with negative outcomes. Yet there is no direct evidence suggesting that mid-day napping is detrimental.

    Therefore, it is also premature to prescribe napping in this population. In the future, studies focusing on the link between napping and negative outcomes, as well as potential interactions with inflammatory markers, would be useful in disentangling directionality.

    There we have it again! More research is required.


    Do you ever indulge in an afternoon nap?

    Leave a comment below and tell me if you’ll be more likely or less likely to try to fit in a siesta after reading this post.


    Brooks & Lack (2006). A brief afternoon nap following nocturnal sleep restriction: which nap duration is most recuperative? Sleep, 29(6), 831-840.

    Wamsley, Tucker, Payne & Stickgold (2010). A brief nap is beneficial for human route-learning: the role of navigation experience and EEG spectral power. Learning & Memory, 17(7), 332-336.

    http://thesciencenetwork.org/programs/waking-up-to-sleep/robert-stickgold


    This blog was updated in December 2019 to reflect current research.

    The post The neurobiology of the afternoon nap appeared first on Your Brain Health.

    in Yourbrainhealth on December 12, 2019 10:58 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Aguzzi and the Lowlifes

    The prion researcher Adriano Aguzzi used to describe his Pubpeer critics as "lowlifes", and himself as a victim of a lynch mob. But after Elisabeth Bik helped him find even more mistakes in his papers, Aguzzi changed his stance.

    in For Better Science on December 12, 2019 10:27 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The More We See Fake News, The More Likely We Are To Share It

    By Emily Reynolds

    Over the last few years, so-called “fake news” — purposefully untrue misinformation spread online — has become more and more of a concern. From extensive media coverage of the issue to government committees being set up for its investigation, fake news is at the top of the agenda — and more often than we’d like, on top of our newsfeeds.

    But how does exposure to misinformation impact the way we respond to it? A new study, published in Psychological Science, suggests that the more we see it, the more we’re likely to spread it. And considering the fact that fake news is more likely to go viral than real news, this could have worrying implications.

    Research has found that previously encountered information feels more “fluent” — in other words, we find it easier to process. This, in turn, gives it a “ring of truthfulness”, write Daniel Effron from London Business School and Medha Raj from the University of Southern California: repeated information feels true, even as we simultaneously acknowledge it’s not. And, the pair predicted, because our intuitions often drive our moral judgements, we may feel it less unethical to share frequently encountered misinformation, even if we know it’s false, simply because it has this “feeling” of truth.

    To test their hypothesis, the team surveyed 138 men and women from the US. Participants, who identified with a range of political affiliations, were first presented with six real-life fake news headlines, half of which appealed to Republicans (for example “Election Night: Hillary Was Drunk, Got Physical With Mook and Podesta”) and half to Democrats (e.g. “Pennsylvania Federal Court Grants Legal Authority to REMOVE TRUMP After Russian Meddling”).

    Participants were shown the headlines four times, each time rating how interesting, funny, or well-written they were. After a distractor task, participants were shown a message clearly stating that what they were about to see was fake, and were again shown the familiar headlines as well as six they hadn’t already seen.

    They were then asked to rate the headlines across a number of measures — how unethical or acceptable it would be to publish the headline, how likely they would be to like or share it, post a negative comment or block the person who posted it, and how accurate they felt the headline was.

    The results suggested that familiarity did have an impact. Headlines previously seen by participants were rated as less unethical to publish, and were significantly more likely to be liked and shared than new headlines; participants were also less likely to block or unfollow people who had shared previously seen fake news.

    This probably wasn’t down to misplaced belief in the news, either: participants did not rate previously seen headlines as more accurate than new ones. A second experiment, on 800 participants, found that even seeing the same headline just once before was enough to produce similar results, and a third found that asking participants to “take their time” and “deliberate” over their choices had little effect. In a final experiment, participants were told they could share headlines with others about to take part in a similar study — and again, they were more likely to actively share familiar headlines.
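
    To make the design concrete, here is a minimal Python sketch of the kind of within-subjects comparison the study describes: each participant rates both previously seen and new headlines, and the two sets of ratings are compared with a paired test. The numbers below are simulated purely for illustration; they are not the authors’ data, and this is not their analysis code.

        # Illustrative only: simulated sharing-likelihood ratings (1-7 scale) for
        # headlines a participant has seen before vs. headlines that are new.
        # The data are made up; a small familiarity effect is built in.
        import random
        from scipy import stats

        random.seed(42)
        n = 100  # hypothetical number of participants

        familiar = [random.gauss(3.5, 1.0) for _ in range(n)]  # seen before
        novel = [random.gauss(3.0, 1.0) for _ in range(n)]     # brand new

        # Paired comparison, since each participant rates both kinds of headline.
        result = stats.ttest_rel(familiar, novel)
        print(f"mean familiar = {sum(familiar) / n:.2f}")
        print(f"mean novel    = {sum(novel) / n:.2f}")
        print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")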

    With fake news proliferating on the feeds of billions of people across the world, the findings have important implications. And they could also have an impact on how we deal with fake news. Many efforts to stop fake news rely on fact checking — trying to inform readers that what they’ve seen is not true, and giving a more factual account of what’s actually going on. But with results here suggesting that knowing something is false has little impact on likelihood of sharing, new angles may need to be considered.

    It seems unlikely that we’re going to stop fake news any time soon — especially during periods of political upheaval. But understanding how — and why — we respond to it the way we do may help tackle its spread.

    Misinformation and Morality: Encountering Fake-News Headlines Makes Them Seem Less Unethical to Publish and Share

    Emily Reynolds (@rey_z) is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on December 12, 2019 10:20 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Age-related muscle wasting and cognitive decline linked

    by Blake Alec Miranda.

    We’re all familiar with the concept that with ageing comes a gradual loss of bone density and reduced size and strength of our muscles.

    Some loss of muscle and bone density is normal, but losing too much leads to frailty and increased risk of injury.

    We’re also familiar with the idea that ageing is associated with cognitive decline – a gradual loss of memory and slowed ‘thinking’ and reaction times.

    But here’s an idea that may be new: cognitive decline and muscle loss may be linked.

    Understanding how muscle loss and cognition influence one another is crucial to pinpointing the mechanisms that underlie both dementia and sarcopenia, the muscle-wasting disease of ageing.

    The muscle-wasting disease ‘sarcopenia’ may have its origins in the brain

    Sarcopenia is to our muscles what osteoporosis is to our bones.

    Sarcopenia is defined by a continuous loss of skeletal muscle mass that interferes with an individual’s ability to function independently. It is the reason why older adults lose their strength as they age, and it appears to accelerate after the fourth decade of life in both men and women.

    Muscle loss is often attributed to hormonal and inflammatory processes. However, preclinical evidence links neuron death to muscle loss. As the specific neurons that innervate muscle — so-called motor neurons — degenerate, muscle, in turn, loses strength and mobility.

    University of Otago neuroscientist Associate Professor Phil Sheard is interested in sarcopenia (and he also happens to be Sarah’s Honours thesis advisor … “Hi Phil”!!).

    Sheard is investigating the role of the neuromuscular junction, because progressive withdrawal of the motor nerve from the muscle fibre seems to be a conspicuous feature of elderly muscle. He’s curious about the ‘chicken and egg’ scenario — is motor nerve withdrawal a cause of muscle fibre death, or a consequence of it?

    Sheard says,

    Muscle does nothing on its own, all muscle function is initiated by the central nervous system.

    If the drive from the CNS is taken away, then muscle falls immediately silent.

    If our ageing neurons are one reason why we lose strength with age, how can we preserve both our neurons and our muscles?

    Exercise prevents neuronal death in ageing models of sarcopenia

    The elixir is exercise!

    Sheard and colleagues have published research which shows that exercise prevents age-related motor neuron death and age-related loss of skeletal muscle mass.

    Endurance exercise delays the onset or slows the progress of these degenerative processes, thereby keeping motoneurons alive and preserving neuromuscular integrity and innervation status.

    Sheard explains,

    What does exercise mean for a neuron? For upper and lower motor neurons, exercise is just a change in the number and pattern of action potentials.

    ‘Action potential’ is another term for neural electrical activity, or neural ‘firing’. Thus, a case can be made for stimulating neurons to continually fire action potentials in order to stave off their eventual death. Sheard has demonstrated that exercise slows the degeneration of neurons in the peripheral nervous system, and believes the same holds true for neurons within the brain.

    Engaging neurons in the CNS (the brain) can be done with both exercise and other forms of sensory stimulation, including enriched intellectual and social experiences. We have evidence of the anti-ageing effect of social and intellectual engagement on the brain (see here). And conversely, we have evidence that withdrawal from social activities is a hallmark of worsening cognitive outcomes.

    In one sense, our CNS has many tools to remain healthy and efficient well into our later years.

    Neurons and muscles benefit from being active

    It is important to clarify that while muscle loss and cognitive decline are related, the exact mechanism linking the two requires further study. For example, individuals in impeccable physical health can still develop dementia.

    Questions Sheard and others are asking include:

    • Are the processes that lead to both sarcopenia and dementia inherently related at a cellular, multifactorial level?
    • Does sarcopenia increase risk for cognitive decline by virtue of CNS decay alone, or the fact that as frail individuals lose independence, they withdraw from social and intellectual engagement?
    • Are the methods for keeping neurons healthy throughout our bodies pretty much the same, or is there more to it?

    With these closing questions, I encourage you to look into the vast conversation surrounding neuroscience and physiological outcomes. Our muscles are important, and keeping them healthy may mean saving our brains.

    As Sheard sums up,

    Muscles and neurons benefit from the same thing: being active.

    It’s difficult to disagree as emerging evidence suggests stimulation, exercise and engagement improve outcomes throughout our entire nervous system.


    Thanks to Blake Alec Miranda from the Yassa Translational Neurobiology Laboratory at the University of California, Irvine for this article on sarcopenia and cognitive decline.


     

    The post Age-related muscle wasting and cognitive decline linked appeared first on Your Brain Health.

    in Yourbrainhealth on December 12, 2019 08:31 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Building the foundation for future research through Open data, code and protocols

    This blog is part of our series on the Future of Open Science. Read previous posts here

     

    In our last post, we talked about how preregistration can help facilitate a more objective and transparent assessment of science before authors even begin their investigation. In this piece, we explore the components of that investigation that can support reproducibility by making research more easily verifiable: data, code, and protocols.

    Easily reproducible science removes barriers to advancement

    Science is said to be facing a replication crisis, but increased openness of data, code, and protocols may help turn the tide.

    Replication studies like this one have found that articles which made data, code, and protocols appropriately accessible led to reproducible results. The availability of open data is also important to researchers when preparing their own work. More than 70% of respondents to the 2019 State of Open Data researcher survey reported that they are likely to reuse open datasets in their future research, and some rely on them to validate their own findings, increase collaboration, and avoid duplication.

    Openly sharing data, protocols and code eliminates barriers for future researchers seeking to replicate and build upon a previous study’s findings. Shared research outputs can stimulate new fields of enquiry, lead to an increase in the number of papers that research projects produce, and are even associated with higher citation rates for the articles they describe.

    However, while many studies confirm the efficacy of sharing practices, they also demonstrate large gaps in the accessibility of these outputs, even in journals with strong data sharing policies.

     

    Promoting increased transparency 

    The good news is, favorable attitudes toward greater transparency appear to be on the rise. More and more researchers are willing to curate and/or share their data, while many publishers, like PLOS, have committed to data availability policies that require authors to make their data, code, and protocols accessible for their published research and promote a more open culture.

    We’ve heard from our own community that they are ready and willing to participate in increased transparency. Researchers who met with us at conferences from Neuroscience to ASM Microbe have told us that more options for sharing open data and protocols are the biggest change they want to see in Open Science.

     

    [Survey snapshots: SFN 2019, ASM Microbe 2019, ASHG 2019]

     

    So why isn’t sharing already the norm?

    Some data sharing concerns are inherent to the research itself: patient records or data affecting vulnerable populations cannot be shared publicly, and privately funded research may encounter intellectual property issues. But researchers indicated that the major barriers to sharing data and code are a lack of resources (including time, funding, and organizational skills) to do so, as well as uncertainty around receiving credit for their work.

     

    Reproducibility is all of our responsibility

    … And that includes institutions and publishers who play a major role in the creation of incentives for researchers to share more of their work. A significant amount of work has gone into obtaining and analyzing the data, designing the study protocol, and testing new code. Not to mention curating and storing each of these pieces for sharing. Researchers should be recognized for the time and energy they’ve put into this work.

    External tools like protocols.io, Code Ocean, figshare and others enable researchers to document, collaborate on, and share products of their research – with unique DOIs – as their work progresses. Publishers also play a key role in promoting these tools, alongside developing consistent approaches to reporting transparency and providing resources and guidelines that make it easier for authors to share and get credit for their work. But there’s more we can do.

    We’re asking ourselves how we can empower researchers to make their work more open, receive recognition for their contributions, and increase the discoverability of knowledge that we can all benefit from.

    Researchers are telling us they want to see a change in how we share science. We’ll keep looking for ways to make that possible. Check back in with our Future of Open Science series next month where we’ll look at new outputs that increase the transparency of the research process.

     

    in The Official PLOS Blog on December 11, 2019 06:54 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Designing for faster R&D: competitive bidding

    A Q&A with Monica Tan, Director of User Experience at Science Exchange

    Q:

    Monica — Science Exchange’s mission is to accelerate breakthrough discovery for R&D organizations.

    Tell us a story about a particular bottleneck facing R&D, and how Science Exchange’s user experience (UX) team solved the challenge.

    Monica: 

    Getting multiple, competitive quotes for R&D projects is important to many researchers, their organizations, and many government or regulatory agencies.

    Yet, obtaining multiple quotes can be slow, especially using a traditional, “offline” outsourcing process. Science Exchange continuously strives to create online processes that outperform traditional workflows to speed up R&D — and that’s what we did for the multiple quote feature that’s built into our software.

    Q: 

    Can you walk me through what you did?

    Monica:

    As you can see from the illustration below, there are many challenges associated with the traditional workflow — it can be time-consuming, manual, imprecise, and even noncompliant. 

    R&D organizations liked using Science Exchange because of how easy it was to obtain multiple quotes, and we recently improved that process even further. We used an industry-accepted paradigm, the “shopping cart” — familiar to most end users from their favorite B2C online shopping sites.

    Compared to offline workflows for obtaining competitive bids for R&D services procurement, Science Exchange’s “shopping cart” experience is easy and familiar to end users, promoting efficiency and compliance.

    Our new approach allows researchers to request quotes from providers they are interested in working with, through the shopping cart. The researcher has to create only one request, which is then sent with one click to all of the providers they selected, streamlining the process greatly.
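
    A minimal Python sketch of that fan-out idea might look like the following. All of the names here are hypothetical: this is not Science Exchange’s actual code or API, just an illustration of the one-request-to-many-providers pattern.

        # Hypothetical sketch of a "shopping cart" RFQ fan-out: the researcher
        # writes one request; submitting duplicates it to every selected provider.
        from dataclasses import dataclass, field

        @dataclass
        class QuoteRequest:
            service: str
            description: str

        @dataclass
        class Cart:
            providers: list = field(default_factory=list)

            def add(self, provider: str) -> None:
                self.providers.append(provider)

            def submit(self, request: QuoteRequest) -> list:
                # One click: the same RFQ goes to every provider in the cart.
                return [(provider, request) for provider in self.providers]

        cart = Cart()
        for name in ("Provider A", "Provider B", "Provider C"):
            cart.add(name)
        rfqs = cart.submit(QuoteRequest("PK study", "Rodent pharmacokinetics"))
        print(f"Sent {len(rfqs)} RFQs from a single request.")

    The design choice mirrors online retail: the cart separates choosing vendors from composing the request, so the request itself only has to be written once.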

    Q: 

    How exactly does the UX Design team at Science Exchange decide and execute an optimized workflow like this?

    Monica:

    First, we talk to our existing customers to understand their needs and areas where we might improve our platform. We also create and solicit early feedback on flows, wireframes, mockups, and prototypes, so we don’t invest in suboptimal features or solutions. We partner with product management and engineering / development to prioritize the features and improvement requests we hear from our customers.

    Q:

    How do you measure success?

    Monica:

    In the case of the “multiple quotes” feature, we were optimizing an existing capability. So we compared the time required for a typical user to send a request to multiple providers using the first-generation Science Exchange UX and then using the “shopping cart” experience. Check out the results — the new UX enabled 75% faster request submission (sending RFQs to 2 providers) and 109% faster request submission (sending RFQs to 3 providers).

    Ultimately, success looks like increased usage of the Science Exchange platform, and higher levels of customer satisfaction! So far, we’ve definitely heard positive feedback from users — the day we released the new workflow, one of our enterprise R&D customers said, 

    “I know I speak for the team here at [company] when I tell you this is great news!”

    For a UX designer, customer satisfaction, as measured by happiness and success in completing an order, is my ultimate reward.

    [Chart: shopping cart UX vs. Gen 1 request submission workflow. The new UX enabled 75% faster request submission with 2 providers and 109% faster with 3.]
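
    As a footnote on reading those percentages: assuming “X% faster” describes submission speed rather than time saved, the elapsed-time ratio is 1 / (1 + X/100). A quick back-of-the-envelope check in Python:

        # Convert "X% faster" (a speed increase) into time saved, assuming the
        # percentages describe speed rather than elapsed time.
        for pct_faster in (75, 109):
            speed_ratio = 1 + pct_faster / 100   # new speed / old speed
            time_ratio = 1 / speed_ratio         # new time / old time
            print(f"{pct_faster}% faster -> takes {time_ratio:.0%} of the old time "
                  f"({1 - time_ratio:.0%} saved)")

    On that reading, “109% faster” means the old workflow took roughly twice as long as the new one.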

    in The Science Exchange Blog on December 11, 2019 03:02 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Elsevier chats: Double dipping and other bad manners

    Gino Ussi, Managing Director of Research Solution Sales, talks about open access and pricing at Elsevier

    in Elsevier Connect on December 11, 2019 11:33 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Research 2030 podcast: ‘Breaking up is hard to do’

    What do the technological changes in the world of research mean for traditional journals and their articles?

    in Elsevier Connect on December 11, 2019 10:11 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    ‘Three’s a crowd’: The triple impact of hearing, vision and cognitive impairment in older people

    Hearing and vision impairment are increasingly common as people age. After the age of 70 years, over two-thirds of people have significant hearing and/or vision problems. Dementia is also strongly linked with age; more than one third of people over 90 have dementia. Unfortunately, hearing and vision impairments are under-identified and under-treated in people with dementia. When sensory impairments co-occur with cognitive difficulties, the impact on the individual is magnified: dependency may increase, cognitive decline may be more rapid, and communication problems are greater. This increases the risk of social isolation, as well as of developing delirium (confusional states) and challenging behaviors (e.g. agitation, aggression). Ultimately, the quality of life for the person with dementia declines, and the cost of care increases. Despite this, the opportunity to address hearing and vision impairment as a cost-effective way to improve outcomes for older adults is significant. Ameliorating sensory impairments to improve mental well-being is a key aim of our European Commission-funded Horizon 2020 program ‘SENSE-Cog’.

    What we did

    Our journey with SENSE-Cog started in 2016 in response to the call for proposals under the banner of ‘Mental Wellbeing for Older People.’ Based on this, we gathered 27 investigators from across Europe, representing a range of disciplines, including hearing, vision, and cognitive health, and the fields of epidemiology, health economics, biostatistics, and clinical trial design. Together, we decided to examine the links among hearing, vision, and cognitive health, with a view to clinical applications. In our new study in BMC Geriatrics, we consulted with people living with dementia (n=18) and their care partners (n=15) in the UK, Cyprus, and France to determine the best approach to improving the lives of people with dementia with co-occurring hearing and vision problems. Several key themes emerged, including: (1) the need to improve our methods of assessing cognition in people with hearing and vision problems and (2) the possibility of improving quality of life in dementia by improving hearing and vision health.

    The lack of valid and reliable assessment tools … was a significant unmet need.

    Challenges identified

    Our qualitative data revealed three areas of difficulty for persons with dementia (PwD) and their care partners, illustrated here with quotations.

    First, hearing, vision, and cognitive assessments were not appropriately addressed to the complex needs of PwD and sensory comorbidity:

    I know what she got it wrong [in the assessment], I knew the ones she can’t do […], you’ve got to draw a diagram, which will be inside a box, and […] I know she can’t do it, she can’t. But I mean that’s it, I got nothing, I got nothing else. Just a score. (Care partner)

    Second, PwD commonly experienced challenges in communication and conveying unmet needs and concerns to care partners and professionals, across domains:

    It [my condition] has taught me to listen to people. If they’re talking to me, you know, really listen to what they say, which I think some, some, I don’t say generalised but some people don’t listen to what you say to them. So if I want to know I’ve got to listen. (PwD)

    Third, information about and guidance regarding support for the condition was not adequate in the assessments:

    Researcher: so there wasn’t much of an explanation?

    Care partner: […] yes more explanatory, this might have happened because there are different doctors and they are very busy…

    Researcher: neither for how to take care…

    Care partner: They have to inform us more.

    From our consultations, it was clear that the lack of valid and reliable assessment tools for assessing cognition in people with sensory impairments and for assessing hearing and vision in people with cognitive impairments was a significant unmet need.

    People at risk of or with dementia often do not report hearing and vision impairments. Likewise, people with hearing and vision problems often have difficulty completing cognitive assessment tests, since most of these tests rely on intact hearing and vision.

    Responding to unmet needs

    We then attempted to address these outstanding needs by (i) developing and validating new cognitive assessments specifically adapted for people with hearing or vision impairment and (ii) developing and evaluating a home-based ‘sensory support intervention’ to improve quality of life and other outcomes for people with dementia.

    The assessment development work centers around alternative forms of the Montreal Cognitive Assessment (MoCA) being developed and validated for people with either hearing or vision impairment. The home-based intervention involves assessment of hearing and vision linked to individualized sensory support for people with dementia, delivered by a sensory support worker. The additional support from the therapist helps with uptake, correct use, and maintenance of hearing aids and glasses. The therapist also provides advice on improving the sensory environment in people’s homes; this might involve better lighting, noise reduction, or addressing problems with acoustics. Finally, the therapist also teaches communication skills for care partners and directs the person with dementia to community supports and services, if needed. The intervention was developed with the input of several stakeholders and field-trialed in three countries. It is now being evaluated in a definitive randomized controlled clinical trial in five European countries.

    Throughout the SENSE-Cog program, the needs and perspectives of patients and their care partners have been central. The design and conduct of all of the work is informed by a network of ‘Research User Groups’ in five European cities, comprising people living with dementia or sensory impairment and their care partner(s).

    What’s next?

    SENSE-Cog will provide validated cognitive assessments adapted for hearing and vision impairment to improve the reliability of diagnoses. It will also provide an evidence base for sensory-based interventions to improve outcomes for people with dementia and inform clinical practice guidelines for clinicians and care workers in hearing, vision, and cognitive health. In addition, the SENSE-Cog project is creating momentum and interest in the worldwide research and clinical community in the growing field of sensory-cognitive health. Our future work includes translating the sensory support intervention to other settings, including low- and middle-income countries (SENSE-Cog ASIA) and care (nursing) home settings (SENSE-Cog CARE).

    The post ‘Three’s a crowd’: The triple impact of hearing, vision and cognitive impairment in older people appeared first on BMC Series blog.

    in BMC Series blog on December 11, 2019 09:00 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Why Fear Of Rejection Prevents Us From Making Wise Decisions

    By guest blogger David Robson

    When you have a disagreement with your boss, how do you respond? Do you consider that you might be at fault and try to consider the other’s viewpoint? Or do you dig in your heels and demand that other people come around to your way of thinking? In other words, do you make wise, practical decisions, or are you prone to being stubborn and petty in the face of criticism?

    As I describe in my recent book, The Intelligence Trap, a whole new field of “evidence-based wisdom” aims to measure these kinds of differences in people’s thinking and behaviour, and to understand the reasons for them. Taking inspiration from philosophers like Aristotle and Socrates, these psychologists argue that wise reasoning involves five qualities: intellectual humility (recognising the limits of your knowledge); a recognition of uncertainty about how a situation may unfold; a recognition of others’ perspectives; the use of an outsider’s viewpoint; and the search for compromise. There are now various measures of these ways of thinking, including self-report questionnaires and more involved laboratory experiments. And researchers have begun to study how wise reasoning is related to other individual differences: for instance, these measures appear to predict well-being above and beyond other cognitive traits (like IQ).

    Igor Grossmann at the University of Waterloo has been at the forefront of this work, and his latest paper, available as a preprint at PsyArxiv, examines whether people’s level of “rejection sensitivity” might determine some of the individual differences in wisdom. Working with Anna Dorfman and Harrison Oakes, he hypothesised that the threat of rejection could lead some people to become more self-defensive as they strive to protect their own ego, potentially reducing their willingness to accept their mistakes, see others’ viewpoints, and look for compromise — their capacity for wise reasoning, in other words.

    To find out if that were true, the trio conducted a series of six experiments that asked a total of 1,617 participants to consider how they would respond to various workplace conflicts, such as excessive criticism from a colleague. After they had thought through their reactions, the participants were asked to rate how much they had engaged in each of the five qualities of wise reasoning (such as intellectual humility) described above.

    They were also tested on more general “rejection sensitivity”, using a standard questionnaire. Participants were asked to rate how anxious they would feel before inviting family members to an important event, in case they refused to come, and in various other situations.

    As Grossmann’s team had hypothesised, the people who were more sensitive to rejection did indeed tend to score lower on all five of the qualities associated with wise reasoning when faced with the conflict: they reacted in a more defensive and closed-minded way, for instance, and were less intellectually humble. Interestingly, this was true even when the researchers controlled for personality traits, including neuroticism and narcissism — the fear of rejection was an independent factor in explaining the differences in wise reasoning. Given the importance of wise reasoning in resolving conflicts, these results also suggest that people who fear rejection may be less likely to find an amicable resolution to these kinds of situations.

    When designing the scenarios, the team had also tried to manipulate the power dynamics within the conflict — some participants were told they were a supervisor, for instance, while others were a subordinate. Perhaps a sense of power makes you more ego-centric, the researchers theorised, which would mean you are less wise. However, the participants’ position with the company had little effect on the overall scores of wise reasoning, though it did influence some of the individual qualities. Imagining themselves in a more subordinate position increased people’s intellectual humility, for instance, while the high-flyers were more likely to see the other’s perspective. Your boss may seem aloof and uninterested in your feelings, but perhaps they are more capable of seeing your viewpoint than you think.

    Like much of the research on evidence-based wisdom, this new finding gives us plenty of food for thought about our own actions and the dynamics of our relationships. It remains to be seen whether there are ways to overcome our sensitivity to rejection to increase our wisdom in these difficult situations. For those of us who struggle to take knocks to our confidence, old habits die hard.

    Grossmann’s previous research suggests it should be possible, though. As I’ve written for Research Digest previously, a technique called “self-distancing”, in which you describe your dilemma in the third person (“David was angry that…”), can help us to become less immersed in the feelings of threat or upset, which allows us to take a slightly more dispassionate attitude — almost as if we are talking through someone else’s problem, rather than our own. You may feel like Elmo or even Donald Trump, but that slight detachment encourages more open-minded and humble thinking, leading to higher scores on the wise reasoning tests.

    Future research may suggest other ways of dealing with the fear of rejection, to reduce self-defensive and closed-minded behaviour during conflict. For now, the best approach would be to take a deep breath and try to advise yourself as if you were your best friend.

    Rejection sensitivity hurts your open mind: Effects of rejection sensitivity and power position for wise reasoning in workplace conflicts [this study is a preprint, meaning that it has yet to be subjected to peer review and the final published version may differ from the version on which this report was based]

    Post written by David Robson (@d_a_robson) for the BPS Research Digest. David is a writer based in Barcelona and London. His first book, The Intelligence Trap, which examines our most common thinking errors and how to avoid them, was published earlier this year.

    At Research Digest we’re proud to showcase the expertise and writing talent of our community. Click here for more about our guest posts.

    in The British Psychological Society - Research Digest on December 11, 2019 08:30 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    arXiv’s Accessibility Goals

    The goal of accessibility is to provide access to as many users as possible, including those who use assistive technologies. It is a natural and important extension of arXiv’s core mission.

    We have recently posted our accessibility statement to the website that defines our thinking and roadmap for achieving an accessible arXiv. We welcome feedback from the community as we move forward on this path.

    While we continue to make accessibility updates to the existing legacy codebase, our arXiv NG project will help us achieve a new level of access. It is “born accessible”—built with accessibility in mind from the ground up—and makes use of structural, semantic HTML as the best way to support assistive technologies and beyond.

    Accessibility is good business practice in every way. It opens arXiv to a wider range of users, moves us towards semantically resilient code, clarifies content and meaning, and minimizes legal risk. The accessibility statement on the website covers our goals in fuller detail, but here are a few of the benefits to the arXiv community as we make progress towards accessibility and inclusion:

    1. arXiv is free and open to everyone from all over the world. An accessible website demonstrates arXiv’s commitment to open access knowledge.
    2. Accessible design offers clear and direct benefits to arXiv’s extremely diverse and international user base. Accessible design is, by its nature, flexible and allows content to faithfully render across a broad spectrum of devices, platforms, assistive technologies, and operating systems.
    3. Accessible websites, through their emphasis on core meaning and their rejection of superfluous content, greatly benefit those with a slow internet connection or limited bandwidth.
    4. Moderators and administrators are a critical part of arXiv’s longevity and success. To ensure arXiv is open to the most qualified moderators and administrators the interfaces that they use must be accessible.
    5. Of course, avoiding legal risk is critical to all organizations. Accessibility is not just a great business practice and the right thing to do; it is a legal requirement and a Cornell mandate. But when we shift our thinking from minimum compliance to focus instead on the opportunity and creative challenge of building better experiences for everyone, we can create a more sustainable, user-friendly platform for the entire community.

    Learn more about arXiv’s commitment to accessibility.

    in arXiv.org blog on December 10, 2019 07:28 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Good At Heart? 10 Psychology Findings That Reveal The Better Side Of Humanity

    By Matthew Warren

    Last year we published a list of ten psychology findings that reveal the worst of human nature. Research has shown us to be dogmatic and over-confident, we wrote, with a tendency to look down on minorities and assume that the downtrodden deserve their fate. Even young children take pleasure in the suffering of others, we pointed out.

    But that’s only half of the story. Every day, people around the world fight against injustices, dedicate time and resources to helping those less fortunate than them, or just perform simple acts of kindness that brighten the lives of those around them. And psychology has as much to say about this brighter side of humanity as it does the darker one. So here we explore some of the research that demonstrates just how kind and compassionate we can be.

    Kids as young as two are surprisingly selfless…

    We tend to think of toddlers as selfish creatures — but it seems that even very young children are more prosocial than we give them credit for. In one study, pairs of two-year-olds were given a set of marbles that made a nice sound when put into a box. Children this age are not exactly renowned for sharing — yet the team found that about half the time, the kids divided up the marbles fairly. Only in 19% of trials did one child selfishly take all of the toys for themselves. In a variation of the study, one of the pair started out with three marbles, while the other received just one. Even in this case, the luckier child willingly gave one of their marbles to the less fortunate kid in about a third of trials. And toddlers even seem to enjoy helping others. Two-year-olds show positive body language and emotions after doing a task which benefits someone else — in fact, helping others seems just as rewarding to the children as helping themselves.

    …and some theories suggest that we are all, in fact, altruistic by nature

    It’s not just toddlers: we’re all predisposed to help others, according to some researchers. Even in the presence of strangers who we may never see again, we regularly act for the benefit of the group and punish those who don’t, even if it comes at a personal expense — a behaviour sometimes called “strong reciprocity”. This could explain why people often co-operate rather than act selfishly in the kinds of tasks psychologists use to study economic behaviour — and why people will even forgo some of their winnings to punish those who act unfairly. It could even account for the fact that we love to queue (and shout at those who skip the line).

    Our personalities have “light” dimensions — they just haven’t been studied as much as the dark ones

    Thousands of papers have been published on the so-called “Dark Triad” of traits: narcissism, psychopathy and Machiavellianism, all associated with undesirable behaviours like manipulation, egotism and callousness. But this focus on the undesirable side of our personalities “misrepresents the full capacities of humanity,” according to Scott Barry Kaufman and colleagues. Earlier this year, the team published a new scale to measure what they call the “Light Triad”, made up of humanism, “Kantianism” (treating people not as a means to an end, but as ends in themselves), and faith in humanity. The work was exploratory and it’s still early days, but the team found that high scores on the Light Triad traits were associated with a greater quality of life — and overall, people generally scored higher on the Light Triad than the Dark Triad.

    When it comes to the environment, children have a profound sense of right and wrong

    That’s according to a 2011 study in which children were presented with various scenarios in which a person displayed bad manners, performed some moral transgression like stealing, or did something environmentally harmful, like damaging a tree. The kids rated harm to the environment as worse than bad manners, and they overwhelmingly gave “biocentric” rather than “anthropocentric” explanations for their ratings: they believed that the environment per se is worthy of respect, not just because it provides sustenance to humans. Of course, this finding will probably come as no surprise to most of us eight years later, now that children are leading a global movement to demand action against climate change.

    We place more value on someone else’s physical suffering than our own

    We all know about Stanley Milgram’s experiments, in which participants were convinced by an authority figure to administer electric shocks to another person. But not only has the classic interpretation of these studies been questioned — other work has also found that we’re really not that keen for another person to experience physical suffering. In a 2014 study, researchers gave participants cash rewards for electric shocks, with the opportunity to increase the intensity of the shock for extra money. The twist? Sometimes the participants themselves received the shock, but in other cases a stranger in another room was the victim. Surprisingly, the team found that people were much more willing to give themselves a larger shock for extra money: they needed twice the cash incentive to raise the pain level for the stranger.

    “Paying it forward” is a real thing

    Yes, it sounds just like the plot of an overly-sentimental, early-2000s film, but it turns out that when we are nice to people, they really do pass on the goodwill to others. Researchers asked a small group of employees to perform simple acts of kindness to their co-workers for four weeks. By the end of the study period, levels of morale and happiness increased amongst those who had been on the receiving end of those acts. But, importantly, these receivers also reported that they themselves had begun engaging in more positive behaviours towards others.  “Our study suggests that although everyday prosocial acts may be small, they are not insignificant,” the researchers conclude. “The benefits of prosociality do multiply, favoring not only those who give but also those who receive and observe.”

    One of our guiltiest pleasures may be motivated by feelings of empathy and compassion

    The rise of reality TV might seem like a sure sign of the worst excesses of humanity. The fact that we will spend our free time watching strangers’ relationships fall apart or people having their dreams crushed in front of an audience of millions doesn’t exactly inspire confidence in our species. And yet, according to a 2016 study, our drive to watch reality TV might have altogether more positive origins. Researchers gauged participants’ opinions on reality shows like Big Brother and MasterChef, and asked how much they would like to participate in these shows and what they’d think if a family member wanted to take part. The more that people enjoyed watching the shows, the happier they said they’d be for themselves or a loved one to participate. The authors say that the findings suggest people watch reality TV because they empathise with the contestants, not because they like to see people being humiliated — otherwise they would surely be against their loved ones taking part. Perhaps that conclusion should be taken with a pinch of salt, but at the very least the research does imply that our motivations may not always be as unpleasant as we instinctively believe.

    Although we think we’re more selfish when we’re hungry, we’re actually not

    Everyone can relate to the feeling of being “hangry”. But it turns out that even when we haven’t eaten in a while, we actually remain pretty helpful and co-operative towards others. That’s according to a 2019 paper which found that hungry people were just as likely as those who had recently eaten to contribute money to a shared pot, or agree to participate in a future study. Yet people expected others and even themselves to be less co-operative when hungry. The authors say that’s due to the “myth of self-interest”: the mistaken belief that we are all inherently selfish. And while a 2011 study famously found that judges make fewer favourable rulings as they get increasingly hungry, that finding has also since been debunked.

    Those of us who have suffered the most show the greatest compassion towards others

    Experiencing trauma can be devastating and have a range of negative repercussions. But people who have had more adverse experiences also tend to be more empathetic, according to research from Daniel Lim and David DeSteno. The pair has found that those who have been through a greater number of negative experiences like disasters, bereavements and injuries display more empathy and are more willing to give to charity. That seems to be because they have a stronger belief in their ability to make a difference to others who are suffering. This isn’t to say that adversity is a positive thing, but it does suggest that out of the darkness, something good can sometimes emerge.

    Finally, many “classic” studies that seem to reveal the worst of human nature have been challenged

    There are a handful of classic social psychology findings that every student learns about and which, more often than not, tell us something awful about the human condition. But the textbook account of these studies doesn’t always stand up to scrutiny. Take the Stanford Prison Experiment, in which volunteers assigned to the role of prison guards began to abuse those in the role of prisoners. Recently, psychologists have challenged this traditional account: one analysis of recordings from the experiment suggests that the researchers took a much more active role in encouraging the guards to behave in a tough way than they originally claimed, undermining the gloomy conclusion normally drawn from that study. Another example is the bystander effect, which most textbooks will illustrate using the case of Kitty Genovese, except that the description of that horrific case that most of us have read is not exactly correct. So while the bystander effect is undoubtedly real, and although there are plenty of terrible examples throughout history of people using their position of authority to cause harm, it’s also clear that human nature is far more nuanced than we learn about in PSYC101.

    If this post has left you feeling too warm and fuzzy, our first piece in the series will quickly fix that. Read it here: What Are We Like? 10 Psychology Findings That Reveal The Worst Of Human Nature

    Matthew Warren (@MattbWarren) is Editor of BPS Research Digest

     

    in The British Psychological Society - Research Digest on December 10, 2019 10:26 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Highlights of the BMC Series: November 2019

    BMC Psychiatry – Prevalence of problematic smartphone usage and associated mental health outcomes amongst children and young people: a systematic review, meta-analysis and GRADE of the evidence

    Since the introduction of widely available smartphones, there has been uncertainty surrounding a possible association between smartphone use and declining mental health among children and young people (CYP). This follows a 68% increase in rates of self-harm in the UK since 2011, when smartphones first became widely available. Recent studies have provided conflicting results, with some finding significant associations where others do not, suggesting it may not be smartphone use or screen time that is associated with poor mental health but rather smartphone addiction. This is characterized by a new term, Problematic Smartphone Use (PSU), which encompasses several concepts of behavioral addiction: tolerance, withdrawal (dysphoria when the battery dies), preoccupation, neglect of other activities, subjective loss of control, and continued use despite evidence of harm.

    By conducting a thorough systematic review and meta-analysis, the authors of this paper found some harmful associations between mental health and smartphone use. The review showed the prevalence of PSU to range between 10% and 30%, with females in the 17-19 year-old age group most likely to exhibit symptoms of PSU. The study also identified a significant association between PSU and depression, anxiety, insomnia, stress and poor educational attainment, as well as identifying similarities between PSU and substance abuse disorders. Like alcohol, smartphone use is socially acceptable and widely available, posing a different and arguably more pressing public health problem than substances of abuse. With 1 in 4 CYP demonstrating problematic smartphone use and patterns of behavioral addiction that mirror substance use addiction, urgent policy reform is required to prevent possible long-term and widespread impacts on current and future generations’ mental health and well-being.

     

    BMC Cancer – Variation in the initial assessment and investigation for ovarian cancer in symptomatic women: a systematic review of international guidelines

    With an increasing number of tests available for the diagnosis of women with symptomatic ovarian cancer, and 200,000 women per year being diagnosed with ovarian cancer globally, it has become increasingly important to understand and evaluate differences in policy and guidance for testing strategy across health systems. By conducting a systematic review, the authors of this manuscript identified that most women with ovarian cancer are not diagnosed until the onset of clinical symptoms. Their results also identified significant differences between international guidelines for the identification of ovarian cancer: differences persist not only in the clinical features suggested to trigger the suspicion of ovarian cancer but also in the initial examinations and investigations these guidelines advocate.

    Since significant positive correlations are already known to exist between national survival rates and primary care practitioners’ readiness to investigate or refer women presenting with possible symptoms of ovarian cancer, variation in international guidelines could influence the timeliness of ovarian cancer diagnosis and survival. The study found a significant correlation between the complexity of tests performed and the specialty of the clinician performing the initial assessment, suggesting that in countries such as the UK, Ireland, Australia and Scandinavia, where general practitioners (GPs) act as gatekeepers to more specialist referrals, the testing available for ovarian cancer was much more limited, potentially increasing the time to diagnosis. The authors conclude that guidelines, and especially those related to the healthcare system (national or private), significantly influence the likelihood of timely ovarian cancer diagnosis.

     

    BMC Public Health – Describing associations between child maltreatment frequency and the frequency and timing of subsequent delinquent or criminal behaviors across development: variation by sex, sexual orientation, and race

    Child maltreatment has been linked to lower health, education, and income later in life, and is associated with increased engagement in delinquent or criminal behaviors. With 9 out of every 1000 children in the United States experiencing maltreatment that results in their involvement in the child welfare system, this study aimed to explore trajectories of behavior from adolescence into early adulthood and how maltreatment during childhood may act as a predictor. Using a sample of more than 10,000 U.S. adolescents, the authors aimed to test the effect of maltreatment on behavior, with a special focus on physical, sexual, emotional and psychological abuse, exploitation or neglect perpetrated by individuals in positions of power.

    The study finds that individuals exposed to maltreatment had higher levels of violent offending, increasing progressively with maltreatment frequency. It also found that individuals identifying as LGBQ (lesbian, gay, bisexual, or queer) had higher non-violent offending frequencies than non-LGBQ individuals. In addition, the study identified that the relationship between maltreatment and predicted non-violent offending is stronger for males than for females, with maltreated teenage males having the greatest predicted non-violent offense frequency. The authors conclude that better understanding the differences in experiences between males and females may be particularly important as increasing numbers of females become engaged in the juvenile justice system. Further, they argue for teaching males, in particular, how understanding and identifying stress responses could decrease the need for externalizing behaviors.

     

    BMC Developmental Biology – Electrochemical gradients are involved in regulating cytoskeletal patterns during epithelial morphogenesis in the Drosophila ovary

    The process of oogenesis in Drosophila has become an important model system for investigating many aspects of cell and developmental biology, and is now one of the most thoroughly and extensively studied stages in the development of this model organism. With the development of the oocyte from stem cell to egg depending on a wide range of cellular processes and bioelectrical properties, the authors of this paper set out to study the connections between electrochemical signals and the patterns of basal microfilaments and microtubules. By selectively inhibiting several ion-transport mechanisms, the authors were able to determine that stage-specific bioelectrical patterns are driven by naturally occurring changes in intracellular pH (pHi) and membrane potential (Vmem) gradients. These changes in intracellular gradients, and thus in bioelectrical properties, also produce the cytoskeletal changes observed during different states of the follicle-cell epithelium (FCE), which is instrumental in the regulation and organisation of cell and tissue architecture.

     

    BMC Pregnancy and Childbirth – Prevalence of self-reported mental disorders in pregnancy and associations with adverse neonatal outcomes: a population-based cross-sectional study

    With 3.7 deaths per 100,000 pregnancies resulting from mental-health-related causes, mental disorders are among the most common causes of mortality and morbidity for women during the perinatal period; 101 women died by suicide in the UK and Ireland between 2009 and 2013 alone, representing one in seven of all maternal deaths. However, whilst suicide is the leading cause of mortality in perinatal women, it is not the only mental-health-related outcome: perinatal depression, anxiety and psychosis are other serious effects of mental disorders during this period. Previous studies have found that women with a history of severe mental illness commonly experience a relapse or deterioration of their condition during the perinatal period, with 43% of women experiencing a relapse of major depressive disorder and up to 50% a relapse of bipolar disorder. Other commonly observed relapses include psychosis, schizophrenia, anxiety disorders, obsessive-compulsive disorder and eating disorders, and these are often further compounded by substance misuse or intimate partner violence. Declining mental health in pregnant women has also been associated with adverse neonatal outcomes: studies have found strong associations between maternal depression and preterm birth, and children of women with schizophrenia are more likely to be born with congenital abnormalities, underlining the serious impact of maternal mental health on pregnancy outcomes. This study highlighted that, out of a cohort of over 140,000 pregnancies, one fifth of women reported a history of a mental disorder, and 80% of those women had no access to specialist perinatal mental health services. This paper emphasizes the critical importance of routine enquiry regarding the psychiatric history of women presenting to maternity services.

    BMC Energy – Call for papers: Green Energy and Smart Systems

    BMC Energy has opened its latest Thematic Series entitled “Green Energy and Smart Systems,” inviting submissions that include but are not restricted to:

    • The importance of implementing smart systems utilizing alternative and renewable energy;
    • Design and implementation of smart home energy management systems using green energy;
    • Holistic approaches in integrating green energy into smart cities;
    • The requirement for green energy in the future advancements of a smart system;
    • Application of solar and wind energy for smart home energy monitoring systems;
    • Recent advances in green energy for smart systems;
    • Efficient energy consumption systems using renewable energy;
    • Future aspects of using renewable energy in smart home systems;
    • Role of green technologies in building a smart energy system.

    With the deadline for submissions closing soon (25th January 2020), don’t hesitate to submit your latest research for the opportunity to become part of a selective and topical new special issue in BMC Energy.

    The post Highlights of the BMC Series: November 2019 appeared first on BMC Series blog.

    in BMC Series blog on December 10, 2019 10:10 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Review Commons is now LIVE

    ASAPbio and EMBO Press just launched Review Commons, a platform for high-quality, journal-independent peer review of manuscripts in the life sciences before they are submitted to a journal. 

    PLOS is part of a group of affiliate journals that have agreed to consider submissions with transferred reviews from Review Commons without restarting the review process. All of our journals within scope — PLOS Biology, PLOS Computational Biology, PLOS Genetics, PLOS ONE and PLOS Pathogens — now welcome submissions reviewed at Review Commons.

    Authors can submit preprints or unpublished manuscripts to Review Commons for expert peer review coordinated by professional editors at EMBO Press. Authors can then decide the best home for the resulting Refereed Preprint, which contains the manuscript, the reviewers’ reports, and any author responses. 

    By engaging with preprints and related initiatives like Review Commons, PLOS empowers authors to share more of their work, earlier, so that they can start receiving feedback, credit and citations sooner, in addition to making the scholarly communication process run more efficiently. Researchers in the life sciences are clearly engaged with preprints, and we believe engaging with initiatives like Review Commons puts researchers first, and helps make research more fair, equitable, and accessible for more people. By submitting to Review Commons, authors will spend less time re-submitting their papers to multiple journals and can make informed decisions more quickly, without having to start the process from scratch.

    We also support the notion that with Review Commons, reviewers can focus on reviewing for science and not on specific journal “fit” — we talk more about this idea in our November post, Why Engage With Preprints?

     (We first announced our forthcoming involvement with Review Commons in September.)

    Please see the ASAPbio & EMBO press release here.

     

    in The Official PLOS Blog on December 09, 2019 07:14 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Mindfulness for researchers: an approach for a healthier, more productive career

    Learn a step-by-step approach for practicing mindfulness as a scientist in the new Researcher Academy module

    in Elsevier Connect on December 09, 2019 02:45 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Reddy vs JBC

    Pittsburgh associate professor Raju Reddy and a colleague sue JBC over a retraction. I suggest here more Reddy papers for the chop.

    in For Better Science on December 09, 2019 12:37 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Here’s The First Evidence That Even Lizards Succumb To Optical Illusions

    Central bearded dragon on the rock

    By Emma Young

    It’s been known for centuries that we experience all kinds of optical illusions, and in the past few decades, researchers have shown that some animals, including monkeys, pigeons, and dogs, do too. Now the first ever study of this kind in reptiles has found that even the bearded dragon falls for an optical illusion that we humans succumb to.

    Perceptual illusions — subjective interpretations of physical information — are interesting to psychologists because they reveal important insights into how we construct our representations of the world. This new work, published in the Journal of Comparative Psychology, provides evidence that at least one reptile can be counted among the animals that don’t simply passively process retinal signals, but actively interpret visual data, too.

    Maria Santacà at the University of Padova and colleagues used the Delboeuf illusion in their new study. Look at the image below: both black circles are identical. But most people will under-estimate the size of the one with a bigger white background and over-estimate the size of the other, leading them to report that the former looks smaller.

    The Delboeuf illusion, via Santacà et al (2019)
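
    Since the figure itself may not have survived the page transfer, here is a minimal sketch of our own (with arbitrary sizes, not the study’s stimuli) that reproduces the arrangement in matplotlib: two identical black discs, one inside a large white surround and one inside a small one.

    # A minimal sketch of the Delboeuf illusion (our illustration; sizes are
    # arbitrary, not the study's materials). Both black discs are identical;
    # only the white surrounds differ.
    import matplotlib.pyplot as plt
    from matplotlib.patches import Circle

    fig, ax = plt.subplots(figsize=(6, 3))
    ax.set_facecolor("0.6")  # grey background so the white surrounds stand out
    for x, surround_radius in [(1.0, 1.4), (3.2, 0.7)]:  # big vs small surround
        ax.add_patch(Circle((x, 1.0), surround_radius, facecolor="white"))
        ax.add_patch(Circle((x, 1.0), 0.5, facecolor="black"))  # identical discs
    ax.set_xlim(-0.6, 4.2)
    ax.set_ylim(-0.6, 2.6)
    ax.set_aspect("equal")
    ax.axis("off")
    plt.show()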

     

    Recent studies of capuchin monkeys showed that they make the same mistake. And in tests in which the black circles were replaced with identical portions of food, so too did chimpanzees: when given a choice, they generally opted for the circular food portion with less space around it. (In fact, people also over-estimate the size of food portions presented on small plates.)

    Reptiles, as the authors write, “were long considered to be sluggish and unintelligent; however, when tested under appropriate experimental conditions, they exhibit an impressive array of cognitive abilities.” For example, recent work has shown that both bearded dragons and red-footed tortoises can perceive similarities between 2D pictures and the objects that they represent.

    For the new study, Santacà and her colleagues tested a total of 12 bearded dragons and eight red-footed tortoises, all housed at the University of Lincoln. In place of black circles, they used circles of a jelly food beloved by the two species: mango jelly for the tortoises and a kale, cucumber and mint jelly for the bearded dragons. In place of plates, the researchers used two different sizes of white circle: the bigger “plate” was 4.92 cm across, the other 1.82 cm.

    First they tested whether, when given a choice between a bigger jelly circle and a smaller one, the animals would actually go for the larger one. The bearded dragons did; the tortoises were a lot more inconsistent. Then they repeatedly presented the animals with a 1.5 cm diameter circle of jelly centred on a big or a small circle. The bearded dragons consistently went for the jelly placed on the smaller “plate”. This certainly implies that they mistakenly perceived this jelly to be bigger than the other.

    The tortoises, on the other hand, didn’t show any preference for the smaller plate. But in the first stage of testing, they’d been just as likely to make a beeline for the smaller food portion as the larger one, so it’s impossible to draw any conclusions as to whether or not they might perceive the illusion too, the researchers write. (Why would the tortoises not go for the bigger portion? Perhaps because both portion sizes were already quite big, the authors suggest, also noting that none of the animals were food-deprived during the study.)

    However, this study does provide the first evidence that a reptile species perceives a visual illusion. “This indicates that, like some mammals, birds and fish, some reptiles can interpret and alter visual input related to object size,” the researchers conclude.

    In a commentary on the work, Todd Freeberg at the University of Tennessee writes: “This exciting comparative research raises the possibility of visual perceptual mechanisms that may be fairly widespread in animals.”

    Can reptiles perceive visual illusions? Delboeuf illusion in red-footed tortoise (Chelonoidis carbonaria) and bearded dragon (Pogona vitticeps)

    Emma Young (@EmmaELYoung) is a staff writer at BPS Research Digest

     

    in The British Psychological Society - Research Digest on December 09, 2019 10:06 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Updated NeuroFedora Computational Neuroscience ISO image available

    We've been working on making more software available in NeuroFedora. Neuron is now built with IV support, so models from ModelDB that use it should now be runnable using NeuroFedora.

    The Computational Neuroscience ISO image has been updated to include these improvements. After receiving some feedback, we've also added Julia and R to the image. The new version, 20191201, is available for download here. The checksum file is also provided, so please verify your download before you use it.

    On a Linux system, please download both files (ISO and CHECKSUM), and run the following command in a terminal, in the directory where the files were downloaded, to verify the downloaded ISO:

    sha256sum -c Fedora-31-Comp-Neuro-20191201-1-x86_64-CHECKSUM
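    # On success, sha256sum prints a "<filename>: OK" line for each file listed
    # in the CHECKSUM file; a "FAILED" line means the download is corrupted.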
    

    Other updates

    In the meantime, we continue to package more software from our (rather long) list. jLEMS is now in testing; Arbor is ready for review; Napari is being worked on, along with a bunch of other tools. Help is always welcome, so if you have the software development skills required to build tools from source, or would like to develop them, please get in touch with us!

    NeuroFedora at FOSDEM

    We've submitted an abstract to the Open Research Tools and Technologies Devroom at FOSDEM, which will be held in Brussels on February 1 and 2. There will also be a general Fedora presence at the event, with a booth too. You can learn more about Fedora at FOSDEM here.

    Please get in touch with the people listed on the event page for more information on Fedora at FOSDEM. For NeuroFedora-specific inquiries, you can contact major, the NeuroFedora team member organising our presence at the event (or ping the team on our communication channels).

    Where we need help

    There's always lots to do. Here's a short list of where we need help:

    I won't list the skills that these tasks need, because we're not simply looking for people who have them already. We are looking for people who'd like to promote Open Science, and we will help them learn the skills.


    NeuroFedora is a volunteer-driven initiative and contributions in any form are always welcome. You can get in touch with us here. We are happy to help you learn the skills needed to contribute to the project. In fact, that is one of the major goals of the initiative: to spread the technical knowledge that is necessary to develop software for neuroscience.

    in NeuroFedora blog on December 08, 2019 12:14 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Open access: Could defeat be snatched from the jaws of victory?


    When news broke early in 2019 that the University of California had walked away from licensing negotiations with the world’s largest scholarly publisher (Elsevier), a wave of triumphalism spread through the OA Twittersphere. 

    The talks had collapsed because of Elsevier’s failure to offer UC what it demanded: a new-style Big Deal in which the university got access to all of Elsevier’s paywalled content plus OA publishing rights for all UC authors – what UC refers to as a “Read and Publish” agreement. In addition, UC wanted Elsevier to provide this at a reduced cost. Given UC’s size and influence, its decision was hailed as “a shot heard around the academic world”. 

    The news had added piquancy coming as it did in the wake of a radical new European OA initiative called Plan S. Proposed in 2018 by a group of European funders calling themselves cOAlition S, the aim of Plan S is to make all publicly funded research open access by 2021. 

    Buoyed up by these two developments, open access advocates concluded that – 17 years after the Budapest Open Access Initiative (BOAI) – the goal of universal (or near-universal) open access is finally within reach. Or, as the Berkeley librarian who led the UC negotiations put it, “a tipping point” has been reached. But could defeat be snatched from the jaws of victory?

    For my take on this topic, please download the attached pdf.

    Please note that this document is more eBook than essay. It is very long. I know, I know, people will complain, but that is what I do. 

    Any brave soul willing to give it a go but who (like me) does not like to read long documents on the screen may like to print it out as a folded book. I have long used the Blue Squirrel software ClickBook to do this. Alternatively, you can print booklets directly from word processing software like Word, and I am happy to send over a Word file to anyone who would like to do that.

    Meanwhile, the eBook is available as a pdf file here.


    Rick Anderson has published a summary of, and commentary on, this eBook on The Scholarly Kitchen here.

    A second post on The Scholarly Kitchen referencing this eBook was published 10 days later here.

    in Open and Shut? on December 08, 2019 08:08 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Cat Whisperers And Dog Listeners: The Week’s Best Psychology Links

    Our weekly round-up of the best psychology coverage from elsewhere on the web

    Why do children so readily believe in someone as ridiculous as Father Christmas — a man who flies around the world climbing down millions of chimneys, all in one night? It’s not that children are simply gullible beings, argues Rohan Kapitany at The Conversation: studies have shown that they can actually be pretty sceptical. Instead, it’s the detailed and committed actions modelled by adults — putting up trees, leaving out biscuits, hanging stockings — that seem to suggest to kids that of course this jolly red gift-giver must be real.


    We take for granted our ability to constantly perceive where our bodies are in space. But what happens when people lack this sense of “proprioception”? At Vox, Brian Resnick explores how a small number of patients are helping scientists to unravel the mysteries of our sixth sense.


    As we move away from cash — paying for purchases with cards and, increasingly, our smartphones — how does our buying behaviour change? At BBC Future, Lu-Hai Liang examines the research into the psychology of spending.


    We reported this week on a study showing that culture influences our ability to recognise dog emotions. Now researchers have examined human recognition of cat moods — and found that most of us do pretty miserably. While cats do express emotion in their faces, we’re just not that good at reading them (unless you’re in the small minority the researchers call “cat whisperers”), reports Karin Brulliard for the Washington Post.


    Meanwhile, another study has found that dogs seem to understand that a word is the same even when it’s spoken by different speakers with different accents. Previously, humans were the only animals known to spontaneously filter out differences in voices in order to recognise the underlying word, writes Virginia Morell at Science.


    Finally, a neuroimaging study has identified brain areas that are more active when we have nightmares, reports Russell Deeks at Science Focus. When sleeping participants felt fear during their dreams, they showed more activity in two areas involved in threat response, the insula and cingulate cortex. But that’s not all: in a second study, awake participants who had reported experiencing more nightmares during the week showed dampened insula and amygdala activity while viewing distressing images. This supports the theory that nightmares may act as a kind of “training” that helps us prepare for real-life threatening situations, the researchers say.

    Compiled by Matthew Warren (@MattbWarren), Editor of BPS Research Digest

    in The British Psychological Society - Research Digest on December 06, 2019 10:52 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Celebrating 6 Months of Published Peer Review at PLOS

     

    We recently passed our six-month anniversary of published peer review options at PLOS!!! Already you can read more than 800 PLOS articles accompanied by their complete peer review history — and that number is growing every day. 

    To celebrate this huge step towards transparency – and to be transparent about our own findings – we want to share what we’ve learned from this process so far. 

     

    Author Response

    Between May 22 when we launched this program and November 22, we received 1,536 manuscripts which were eligible for the published peer review history option (authors needed to have submitted their manuscript after May 22 and been accepted for publication). During that time, the average author opt-in rate across all PLOS journals reached 40%.

    The wave of support for revealing the expert assessment behind published work has been even higher on PLOS Biology, PLOS Computational Biology, PLOS Genetics, and PLOS Medicine, where opt-in rates during the first six months were all above 50% (PLOS Biology is now up to 72%!). 

    When we look more closely at PLOS ONE, which represents so many diverse disciplines and researcher needs, the areas with the highest opt-in rates include Public Health (56%), Ecology (50%), Psychology (43%), and Neuroscience (40%).

     

    My own research is concerned with research metrics and analytics and more specifically the development of a research field… to my knowledge, sharing the entire review history is still relatively rare. I hope more researchers as authors and reviewers will adopt the practice to contribute to the transparency of scholarly communication and open science in general.

    Chaomei Chen, PLOS ONE Author, Visualizing a field of research: A methodology of systematic scientometric reviews

     

    This is just the first six months. We’re excited to see what the next year brings as more authors become familiar with the option to publish their reviews, and the community of readers who rely on PLOS find value in more and more expert opinions freely available to read, cite, and share. 

    Veronique Kiermer, PLOS Publisher and Executive Editor 

     

    Has transparent review affected reviewer behavior?

    One question posed by our community is how published peer review would impact reviewer activity. Knowing their comments could be made public, would it become more difficult for editors to find reviewers? Would it take longer to return review comments to the authors? So far, we haven’t seen significant changes in either reviewer acceptance of invitations or timing to submit their comments. Of course, our data is still preliminary, and noisy, as many other factors contribute to the reviewer experience, but we expect to learn more as we go. 

    Ultimately, we hope published peer review will demonstrate benefits to reviewers as well. Making review comments public as a citable object helps elevate the status of reviewer contributions and demystify the assessment process. When reviewers choose to sign their reviews, they can make the review process even more transparent. 

    Among the PLOS articles which have been published with review history so far, 55% also include 1 or more signed reviews. Combined with tools like ORCID for reviewers, signed and published peer review provides more opportunities for reviewers to publicly receive recognition for their work.

     

    Signed peer review provides a clear avenue for journals to recognise and support their peer reviewers…As peer review is crucial to the integrity of the scientific endeavour, formally crediting peer reviewers and working with research institutions and funding agencies to ensure that time spent on peer review is valued, will ensure researchers are able to devote their time to this important enterprise.

    Rhys Grinter, PLOS Genetics Author, Protease-associated import systems are widespread in Gram-negative bacteria

     

     

    Changing the way we see peer review

     We believe true transparency of the research process is the best way to make science more open to researchers and readers of every career stage, demographic and discipline. By opening the peer review process, we can provide more context for research evaluation and educate future generations of researchers.

     

     This publication in PLOS ONE was my first paper that underwent the peer review process, and initially I was unsure on how the peer review process worked… I hope that by publishing our peer review process, we can help demystify the entire process for fellow early career researchers in addition to shedding light on what reviewers may be looking for as they review papers. 

    Alexandra Muir, PLOS ONE Author, Differentiating electrophysiological indices of internal and external performance monitoring: Relationship with perfectionism and locus of control

    To that end, we’re also experimenting with other forms of transparent review by enabling community comments on preprints to be used in the peer review process and streamlining journal agnostic review

     Ultimately, we hope our learnings and the feedback from our community will encourage more researchers and publishers to engage in transparent review, transforming the way we read and evaluate scientific literature. Stay tuned for more updates along the way!

    in The Official PLOS Blog on December 05, 2019 04:36 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    arXiv Machine Learning Classification Guide

    We are excited to see the adoption of arXiv in the rapidly growing field of machine learning. Given the interdisciplinary nature of machine learning, it is becoming a challenge for our volunteer moderators to keep up with verifying the appropriate categories for machine learning applications.

    When submitting to arXiv, authors suggest which arXiv category they think is most appropriate. This classification determines the category in which the paper will be announced, and thus its audience. Our moderators review the appropriateness of classifications in our moderation process, and misclassified papers require additional work from our volunteer moderators to rectify. This means that authors may see their paper delayed for announcement and publication. To help avoid this delay, we encourage you to review the arXiv categories as you prepare your submission.

    Our machine learning (category: cs.LG) moderators are seeing spikes of 250 submissions (new papers, cross-lists, and replacements) per day that need attention, and we expect this volume to continue to rise.

    monthly count of cs.LG submissions to arXiv 2017 to 2019

    In the future we plan to use machine learning to wrangle classification during the submission process rather than the downstream moderation process. When authors go to submit papers to arXiv, we will be leveraging classifier software developed by Paul Ginsparg to suggest categories to authors. This is part of the arXiv Next Generation submission system that we plan to beta test in 2020.

    In the meantime, we are asking our authors to review the subject classes for Computer Science and consider the best fit before submitting. We also have suggestions for some of the most common decisions authors weigh.

    Is my subject most appropriate for machine learning in Computer Science (cs.LG) or in Statistics (stat.ML)?

    These two categories are very similar and overlap substantially. The choice often depends on the authors’ home field of study. In general, reinforcement learning papers should have cs.LG as primary — unless they are more appropriate in math.OC (optimization and control), econ.GN (general economics), or eess.SY (systems and control). Papers categorized with cs.LG as primary are automatically cross-listed as stat.ML and vice versa.

    I am applying machine learning in a field. Which category should I choose?

    If the primary domain of the application is available as another category in arXiv and readers of that category would be the main audience, that application category should be primary. Examples include applications to computer vision (cs.CV), natural language processing (cs.CL), speech recognition (eess.AS), information retrieval (cs.IR; includes document classification, recommender systems, table extraction and recognition, and topic modeling), crowdsourcing (cs.HC), community detection and modeling (cs.SI), education, health and fairness (cs.CY), visualization (cs.GR, and, if evaluated using human studies, cs.HC), traffic management (cs.SY or cs.AI), knowledge graphs and ontologies (cs.AI), steganography (cs.MM), novel sensing techniques (cs.ET), quantitative finance (q-fin), quantitative biology (q-bio), physics (physics or cond-mat) and astrophysics (astro-ph).

    What about within neural network architectures?

    Papers discussing the foundations of neural network architectures (activation functions, spiking neurons, etc.) should list cs.NE as primary, as should papers applying biologically-inspired optimization techniques such as evolutionary methods.

    What about machine learning applied to signal types, like sound and images?

    Papers working with the properties of specific signal types (e.g., sound, EEG, hyperspectral, ultrasound) should consider cs.SD (sound, including music), eess.IV (images and video), or eess.SP (signal processing) as primary. Papers studying images of the real world should consider cs.CV as primary, whereas papers about other image-like signals (fMRI, X-rays, ultrasound, retinal, etc.) usually fit better in eess.IV. Papers that use machine learning techniques to solve specific processing tasks should consider the topic where other approaches for those processing tasks are typically published, for example eess.IV for compression or cs.CV for segmentation, independently of the type of images that are processed. Papers that use computer vision or machine learning methods to attack specific domain applications (for example, detecting diseases in medical images) should be submitted to topics that deal with that application, with cs.CV as a secondary topic. A good rule of thumb for identifying a primary topic is to ask which single community is the most important set of readers.

    What about computers assisting human learning?

    cs.CY is a better fit for papers studying human learning such as computer-aided instruction. cs.LG is appropriate as a secondary category if machine learning techniques are applied to human learning.
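
    As a rough illustration of these rules of thumb, here is a hypothetical sketch in Python (ours, not arXiv’s actual classifier, which is separate software developed by Paul Ginsparg). It simply encodes a few of the domain-to-category mappings above, with cs.LG as the general machine learning default:

    # Hypothetical sketch: encode a few of this guide's rules of thumb as a
    # lookup table. This is NOT arXiv's real classifier software.
    PRIMARY_BY_DOMAIN = {
        "computer vision": "cs.CV",
        "natural language processing": "cs.CL",
        "speech recognition": "eess.AS",
        "information retrieval": "cs.IR",
        "computer-aided instruction": "cs.CY",
    }

    def suggest_primary(domain: str, default: str = "cs.LG") -> str:
        """Return the application's home category if known; otherwise fall back
        to cs.LG (which is auto cross-listed as stat.ML, and vice versa)."""
        return PRIMARY_BY_DOMAIN.get(domain.strip().lower(), default)

    print(suggest_primary("Computer Vision"))        # cs.CV
    print(suggest_primary("reinforcement learning")) # cs.LG (general default)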

    in arXiv.org blog on December 05, 2019 03:07 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Allowing food companies to put nutrition claims on their products may run counter to health promotion efforts

    Nutrition claims (such as “low in fat” or “sugar free”) on food packaging may lead people to increase their consumption of those products and their overall energy intake. When unhealthy products are allowed to carry these claims it may contribute to unhealthy diets and obesity. Policy makers should therefore consider restricting the use of nutrition claims on packaged foods as part of broader efforts to improve population diets.

    What are nutrition claims?

    Nutrition claims are statements on product packaging that suggest or imply that a food has particular nutritional properties, typically in relation to the content of fat, sugar, vitamins or minerals. They can be distinguished from health claims (such as “calcium helps build strong bones”) that describe properties of a food product or food component in relation to health or disease.

    The use of nutrition and health claims varies between countries, with several jurisdictions such as Australia, New Zealand, the European Union (EU), Canada, and the United States regulating their use.

    Nutrition labeling can influence dietary choices

    As part of efforts to improve population diets and address high levels of overweight and obesity, the provision of nutrition information (e.g., through nutrition labels and nutrition claims) on food packages has increasingly become an important policy option.

    There has been extensive research on the influence of nutrition labels. This has shown that nutrition labeling can be effective in empowering consumers to choose healthier products. However, the specific role of nutrition and health claims in the prevention of overweight and obesity has not previously been clearly delineated.

    Our recent review set out to look specifically at the impact of nutrition claims relating to fat, sugar, and energy content on various aspects of food choices to understand how they contribute to efforts to prevent overweight and obesity.

    Impact of nutrition claims relating to fat, sugar, and energy content on food choices

    Our review found eleven relevant studies. Almost all of the studies were assessed as ‘low quality’, due to the nature of the experimental designs and the analysis methods used.

    Three studies explored the influence of nutrition claims on perceived healthiness of products. All of these studies found that nutrition claims on products led consumers to perceive those products as healthier. A further two studies assessed the influence of nutrition claims on ‘tastiness’. These studies found that people expected products with claims to be less tasty, even though they could not always notice the difference in taste.

    Two studies focused on perceived appropriate portion size and energy content of products with claims. These studies found that people perceived products with claims to be lower in calories. People also believed that an appropriate portion size for these products was substantially higher than for products without claims.

    Three studies examined the influence of nutrition claims on purchase intentions. One study found people were more likely to want to try products carrying ‘low fat’ claims. However, other studies found that this influence varied by product category, with claims of reduced fat on chips lowering participants’ intention to buy, and claims of reduced sugar on breakfast cereal having no effect.

    Five studies looked at the influence of nutrition claims on food consumption. All of these studies found that ‘low fat’ claims were likely to lead people to consume more of the product with the claim. Importantly, none of these studies measured the impact of claims on consumption beyond the immediate choice at hand, and so there is no available evidence of how overall daily energy intake might be affected.

    Implications of the results

    Policy makers around the world are currently considering a range of initiatives to improve population diets and address obesity. This review provides evidence to suggest that nutrition claims related to fat, sugar and energy content may lead to overconsumption of foods and subsequent higher energy intakes. Policy makers should therefore consider options to limit potential negative influences of nutrition claims.

    Currently, regulation of nutrition claims varies across countries. In the EU, there are plans under consideration to incorporate specific criteria restricting the use of nutrition claims on particular products. In Australia, health claims are restricted on products that do not meet criteria for healthiness; however, there are no restrictions on the types of products that can carry nutrition claims.

    While food companies use nutrition claims as marketing opportunities, some food companies voluntarily restrict the use of nutrition claims on their products. For example, Coles, an Australian supermarket chain, has a policy to only allow nutrition claims on their own-brand products that meet certain criteria for healthiness. As more attention is focused on the social impact of companies, the use of nutrition claims to boost sales of unhealthy foods warrants closer scrutiny.

    The post Allowing food companies to put nutrition claims on their products may run counter to health promotion efforts appeared first on BMC Series blog.

    in BMC Series blog on December 05, 2019 12:33 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    An interview with BMC Public Health Section Editor, Dr Shankar Viswanathan

    Please tell us about yourself and your research interests

    I am a trained biostatistician with over two decades of experience in both methodological and public health research. My research interests include developing new statistical methods to answer questions arising from medicine and public health. Specifically, I focus on survival analysis, longitudinal data analysis and missing data, as well as related study design issues. My applied (epidemiological and public health) research interests are quite varied: currently, I am working on studies in chronic disease epidemiology, injury epidemiology, and global public health.

    What led you to become a biostatistician? 

    I always enjoyed mathematics and majored in statistics in my undergraduate program. While exploring options for graduate study, I stumbled upon the Biostatistics program. The synergy of mathematics and its application in medicine and public health made it attractive and exciting. The thought of playing a small role in changing public health for the better led me to become a biostatistician.

    Are you working on a particular study or project at the moment?

    I am currently involved in multiple clinical and public health studies. Still, one that is close to my heart examines the role of inflicted traumatic brain injuries in children and their implications for neurodevelopmental disorders.

    How do biostatistics contribute to our understanding of public health issues?

    Biostatistics is a decision science, besides being a study of variation. Biostatistics has long played a significant role in public health: identifying health issues in the population through efficient study design and optimal study size, assessing their extent and their association with exposures, and prioritizing intervention or prevention strategies. It has also played a vital role in policy advocacy, implementation, and quality assurance processes, in addition to evaluating all the above tasks. Biostatistics contributes to these processes by applying existing statistical procedures or by developing new methods to address research questions and test unique hypotheses. It is an evolving discipline that, with the current explosion of big data, will not only provide more insights but also contribute to finding solutions for public health issues.

    What types of submissions would you like to see to the Biostatistics and methods section of BMC Public Health?

    BMC Public Health aims to present the epidemiology of disease and further the understanding of all aspects of public health. I would like to see submissions in this section that align with and enhance that scope.

    The types of submissions that I would like to see to the section are those that address and provide solutions to methodological challenges that evolve out of public or population health. Some examples include but are not limited to manuscripts that demonstrate the application of existing methods in an innovative way for a public health problem with new insights. Other examples are papers that compare and contrast different designs or statistical procedures that provide strategies and suggestions under the various assumptions that are relevant in the public health realm. I would also like to see submissions that propose new methods or designs illustrated with real-life data related to public health.

    About Dr Shankar Viswanathan

    Dr. Shankar Viswanathan is an Assistant Professor of Biostatistics at the Albert Einstein College of Medicine, in the Department of Epidemiology and Population Health. He received his doctoral degree in Biostatistics from the University of North Carolina at Chapel Hill. His methods research focuses on multivariate survival analysis, longitudinal data, and missing data analysis. His applied area focuses on Global Health, Injury Epidemiology, and Chronic Disease Epidemiology. He joined the Editorial Board of BMC Public Health in 2018 as an Associate Editor and became Section Editor of the ‘Biostatistics and Methods’ section of the journal this year.

     

    The post An interview with BMC Public Health Section Editor, Dr Shankar Viswanathan appeared first on BMC Series blog.

    in BMC Series blog on December 05, 2019 11:35 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Eureka Moments Have A “Dark Side”: They Can Make False Facts Seem True

    By Matthew Warren

    We love a puzzle here at Research Digest — so here’s a couple from a recent paper in Cognition. See whether you can unscramble the anagrams in the following sentences (read on for the answers!):

    The Cocos Islands are part of idnionsea

    eeebyoshn kill more people worldwide each year than all poisonous snakes combined

    If you successfully solved the anagrams, you may have experienced an “Aha!” or “Eureka” moment: a flash of insight where the solution suddenly becomes clear, perhaps after you have spent a while completely stumped. Usually when we experience these moments we have indeed arrived at the correct answer — they don’t tend to occur as much when we’ve stumbled upon an incorrect solution. And in fact, researchers have suggested that we even use Aha! moments as a quick way to judge the veracity of a solution or idea — they provide a kind of gut feeling which tells us that what has just popped into our mind is probably correct.

    But relying on these experiences to gauge the truth of an idea can sometimes backfire, according to the authors of the new paper. The team found that experiencing sudden moments of insight when deciphering a statement can make people more likely to believe that it is true — even when it isn’t.

    Ruben Laukkonen at Vrije Universiteit Amsterdam and colleagues gave participants 26 statements containing anagrams, such as those above. The participants had 20 seconds to try and solve the anagrams; if they didn’t solve them in this time then they were shown the answer. They then had to indicate whether they had experienced an Aha! moment while working out the anagram, and to judge how true the statement was. Only half of the propositions were, in fact, true (yes, honeybees do kill more people than snakes), while the other half were false (no, the Cocos Islands are not part of Indonesia – they’re an Australian territory).
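
    (As an illustrative aside, not part of the study: if you want to check mechanically that the scrambled strings above really are anagrams of those answers, a couple of lines of Python will do it.)

    # Illustrative check: two strings are anagrams if their sorted letters match.
    def is_anagram(scrambled: str, answer: str) -> bool:
        return sorted(scrambled.lower()) == sorted(answer.lower())

    print(is_anagram("idnionsea", "Indonesia"))  # True
    print(is_anagram("eeebyoshn", "honeybees"))  # True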

    When participants successfully solved the anagram — on about 60% of trials — they rated the statements as more true than when they failed to figure it out. This effect held regardless of whether the statement was actually true or false.

    On 39% of trials, participants also experienced that satisfying Aha! moment. And in these cases, participants were even more inclined to believe the statement, giving higher ratings of truth compared to those trials where they had correctly solved the anagram but hadn’t experienced that sudden insight.

    These findings suggest that people tended to misinterpret the Aha! moment they got from solving the anagram as an indication that the statement itself was true. “People…may turn to their Aha! experiences as a shortcut in place of a lengthy and effortful review of the evidence,” the researchers write. Of course, this tendency to overgeneralise based on our feelings of insight can be a problem: it can make us more likely to incorrectly believe that false information is true.

    And the results also imply that people who want to actively mislead or persuade us could hijack this phenomenon. “Presentations, news articles, advertising, and other media, may seek to exploit experiences of insight as a tool of persuasion,” the team writes. Perhaps this is already happening: think, for example, about the clever ads you often see inside buses and Tube trains — the ones that contain little riddles or take a moment to “get”. It remains to be seen whether there are any particular strategies that could help us avoid falling prey to this “dark side” of Aha! moments.

    The dark side of Eureka: Artificially induced Aha moments make facts feel true

    Matthew Warren (@MattbWarren) is Editor of BPS Research Digest

     

    in The British Psychological Society - Research Digest on December 05, 2019 11:27 AM.


  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    “To preprint or not to preprint?” What’s the opportunity cost of early, non-peer-reviewed publicly available research?

    Note: this post was written by Sara Rouhi, Director of Strategic Partnerships for PLOS, and was originally published on NISO’s Homepage.

    NISO recently hosted a two-day seminar in Washington DC entitled “Open Access: the Role and Impact of Preprint Servers,” bringing together stakeholders from across #scholcomm to discuss the perceived value, purpose, and implementation of preprint servers.

    As a novice to preprints generally (and to the nuances of their genesis, value proposition, and current implementations specifically), I attended to “bone up” on all things preprints, and because of a preview I received a week earlier. At the Charleston Conference Hyde Park debate, Oya Rieger of Ithaka S+R and Kent Anderson of Caldera Publishing Solutions debated whether preprints have improved the scholarly communication system. 

    Oya’s position on the benefits of fast, early communication of results, prior to peer review, encapsulated the “pro” arguments I’m familiar with re: preprints. Kent’s comments were new to me and, when stripped of some of the agent provocateur language (“preprints disrespect peer-review”), raised many valuable points to weigh when considering the role of preprints. 

    Now, don’t get me wrong: I ultimately come down on the side of PLOS with respect to our commitment to preprints (see an extensive overview on our blog re: current preprint pilots); however, I do think the discussion at NISO merits deeper consideration (for the full live-tweet thread, see #NISOAccess19).

    So! Do the benefits outweigh the concerns?

    Various communities have demonstrated over decades that preprints are valuable, useful, and relatively “safe” in terms of exposing the broader public to research that is not peer-reviewed. In the opening keynote to the seminar, Kent Anderson made some interesting distinctions. First, he’s not anti-preprint (although his Twitter feed sometimes conveys the contrary); he’s anti-preprint server (that is, against the online platforms that disseminate preprints without taking responsibility for the quality of that content and the potential downstream effects of making it discoverable). For the extended unpacking, view the #NISOAccess19 hashtag.

    The argument is an interesting one: in a current information-dissemination climate rife with alternative facts and misinformation, and dominated by algorithms that encourage outrage (where anger = clicks = revenue), research findings — peer reviewed or not — that feed this ecosystem risk spreading like wildfire. And with closed media bubbles reinforcing belief systems held by groups indifferent to facts, unverified and unvalidated scientific claims can be consciously manipulated to further political, social, economic, national security, and other agendas. At the very least they generate clicks; at the very worst they can instantiate beliefs that jeopardize, for example, public health.

    Dog ageing: the perils of news reporting findings on preprint servers

    And I certainly see this more and more. As a guilty-as-charged trashy-news junkie, I stumble upon the infamous Daily Mail from time to time, and they are continually publishing click-baity articles. The most recent one I’ve seen is about dogs ageing quite differently than we have generally assumed. The research was featured in The Telegraph and The Times, which is probably how it found its way to the DM. The DM article certainly doesn’t link to the bioRxiv.org paper (here) and only mentions in passing that it’s not peer reviewed. The Telegraph mentions it in the closing line of the article. 

    When was the last time you got to the final sentence of an online article? Much less a click-baity one? (Note: it took me almost 10 minutes to get from the DM article to the bioRxiv article, and none of the news outlets linked directly to it, just to bioRxiv.org. I had to search within the platform myself.)

    Yes, non-peer-reviewed dog-ageing research seems fairly harmless and mildly amusing. If it turns out dogs don’t age like this, no one will die. 
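
    (For context, and hedging here since this is my recollection of the coverage rather than anything the news articles spell out: the preprint’s widely reported result was a logarithmic dog-to-human age mapping of roughly human_age = 16 * ln(dog_age) + 31, rather than the folk multiply-by-seven rule. A quick comparison under that assumed formula:)

    # Assumed formula, as widely reported from the preprint (treat as a sketch):
    # human_age = 16 * ln(dog_age) + 31, versus the folk "multiply by 7" rule.
    import math

    for dog_age in (1, 5, 10):
        epigenetic = 16 * math.log(dog_age) + 31
        folk = 7 * dog_age
        print(f"dog age {dog_age:>2}: ~{epigenetic:.0f} human years vs {folk} by the folk rule")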

    But once you get into research about vaccines, climate change, and politics, the cost/benefit calculus starts to change.

    What happens when reputable outlets do it?

    The famous case of the “cell phones = brain cancer” study still lingers. (I’m choosing to link to an article debunking it rather than the original paper on bioRxiv, which is still available.) Reputable outlets like Mother Jones and the New York Times published the findings, and now most reliable sources have to preface their reminders that there’s no real evidence for this with extensive explanations of why people think there is. The American Cancer Society has to explain how cell phones work, and the National Cancer Institute spends so much time explaining what radiofrequency radiation is that you might miss the section where it states that there’s no evidence to back these concerns.

    What about when the far right does it?

    The case of Vox, the far-right populist party in Spain, is an interesting and cautionary one. Social scientists across Spain have been raising the alarm over the Vox party’s use of research to justify its racist, xenophobic, and nationalist positions. Over 3,000 Spanish researchers have now signed a petition to hold Vox accountable for its misrepresentation of scientific research.

    They state in their petition:

    “As such, VOX’s strategy amounts to nothing else than a facetious resort to using (allegedly scientific) data in order to pursue an ideological agenda impregnated by extreme nationalism, intolerance, racism and xenophobia. It is an agenda that misuses, derides, and ultimately mocks the work of thousands of social scientists, at the same time as it undermines the foundations of our polity through lies and fabrications.” (emphasis mine)

    One such example is their misrepresentation of a recent study by the Centro de Investigaciones Sociológicas (CIS) on the Spanish public’s attitudes toward immigration. According to the Spanish newspaper El País, 

    “Para sostener sus argumentos, el partido de Santiago Abascal se apoya en el CIS publicado este martes, según el cual “el 15,6% de los encuestados considera la inmigración como un problema para España”, mientras que el cambio climático “solo interesa al 1,5% de los españoles”. En realidad, cuando el CIS pregunta a los encuestados “cuál es el principal problema que existe actualmente en España”, el 10.7% cita la inmigración entre los tres primeros y solo el 2,3% menciona los problemas medioambientales; lo que les sitúa en el 8º y en el 18º puesto, respectivamente, entre las preocupaciones de los españoles.“

    “To support its arguments, Santiago Abascal’s party relies on the CIS survey published this Tuesday, according to which ‘15.6% of respondents consider immigration a problem for Spain,’ while climate change ‘only interests 1.5% of Spaniards.’ In reality, when the CIS asks respondents ‘what is the main problem that currently exists in Spain,’ 10.7% cite immigration among their top three and only 2.3% mention environmental problems, placing them 8th and 18th, respectively, among Spaniards’ concerns.”

    The actual CIS study can be found here.

    What does this have to do with preprints?

    Well, that’s the question. If misinformation, coupled with systems primed to spread it, is the primary concern of those, like Kent, who are worried about research that’s not peer-reviewed entering the mainstream, isn’t this already a problem? Peer review isn’t the mechanism that stops malign actors from misrepresenting — or entirely fabricating — the “facts” they want to promote. (If InfoWars hasn’t taught us this, then nothing has.)

    When the Vox party manipulates research findings to further its messaging, it’s simply misrepresenting how respondents answered surveys. Peer-review isn’t preventing this kind of manipulation.

    Indeed, at the NISO event, numerous speakers representing publishers, funders, preprints providers, and libraries focused on how the benefits outweigh those concerns. This fantastic visual published in PLOS Biology in a 2017 paper on the benefits of preprints for Early Career Researchers (ECRs) illustrates what the various NISO speakers were identifying. 


    The NISO event further underscored these benefits. Whether it was the managing director of SSRN explaining their goal in creating SSRN (“We wanted to recreate the hallway conversations”) or the individual Twitter feedback I got from ECRs during the seminar (“I’ve had lots of great feedback on preprints that have substantially improved the manuscripts”), it’s clear that the communities building preprints see their value.

    NISO speaker Thomas Narock, Assistant Professor in Integrative Data Analytics at Goucher College, noted that successful preprint servers come from communities that value them. They can’t be successful independent of a community driving them. As Angela Cochran put it on Twitter: it’s not an “if you build it, they (the community) will come” situation. Communities have to drive it.

    And the communities that drive it clearly value it. Jessica Polka, Executive Director at ASAPbio (currently a PLOS collaborator re: preprints), closed out the NISO event by underscoring these benefits in a presentation that speaks for itself. Slides 8-10 really speak to how ASAPbio has seen engagement with preprints evolve and what authors want out of them. It’s this kind of feedback that is driving PLOS’ own approach to preprint and open peer-review collaborations.

    The overall takeaway from the second day of seminars was thoughtfully synthesized by Jessica: let’s not get lost in navel-gazing over what a preprint is. Rather, we should focus our efforts on generating a clear “vocabulary for the full suite of peer review and screening checks that can be applied to any version of an article in the publishing continuum.”

    Much of the justified concern about the dissemination of research findings that aren’t peer-reviewed can be mitigated by using the checks and taxonomies appropriate to the field, clearly indicating the moderation strategies used by that community or server, and monitoring versions effectively so readers understand how the version they’re reading fits into a wider scheme of community feedback.

    With organizations like NISO, ASAPbio, and PLOS modeling these kinds of standards and behaviors, the benefits of preprints seem to far outweigh the costs (notwithstanding the question of the financial sustainability of preprints, which I haven’t addressed at all here but which is a serious hurdle to their long-term viability as a vehicle for dissemination — more on that another time).

     

    in The Official PLOS Blog on December 04, 2019 05:25 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Catherine Verfaillie, the Zombie Scientist of KU Leuven

    Catherine Verfaillie is a zombie scientist: her past stem cell research has long been discredited, yet she remains an influential and very well funded star of Belgian science. Now Elisabeth Bik has taken a fresh look at Verfaillie's papers.

    in For Better Science on December 04, 2019 12:16 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    How to address the enigmas of everyday life

    Here are some hard questions: Is the value of human life absolute? Should we conform to the prevalent values? What do we owe our country? Is justice indispensable? How should we respond to evil? Is it right to forgive bad actions? Is shame good? Should we be true to who we are? Do good intentions justify bad actions? Are moral evaluations overriding?

    The questions are hard because each has reasonable but conflicting answers. When circumstances force us to face them, we are ambivalent. We realize that there are compelling reasons for both of the conflicting answers. This is not an abstract problem, but a predicament we encounter when we have to make difficult decisions whose consequences affect how we live, our relationships, and our attitude to the society in which we live.

    I consider these questions from a point of view that is practical, not theoretical; particular, not general; personal, not impersonal; and above all concrete, not abstract. The questions are asked and answered from the point of view of actual people who face actual predicaments whose resolution has implications for the rest of their lives. There are reasonable answers, but they depend on the evaluation of the relative importance of the reasons for and against the conflicting answers. Such evaluations must take into account the character, attitudes, experiences, and the possibilities and limits of the social context of those who face the questions. They vary, and that is why reasonable answers to hard questions must be personal, not theoretical, and be based on comparisons between the conflicting answers given by two people for whom a hard question has arisen in their context.

    Some examples of such comparisons are between a Japanese kamikaze pilot who gave up his life for his country and an American draftee who refused to serve his; between a young girl who endured and survived being forced into daily prostitution and humiliation for more than a year and a priest who struggled to reconcile his faith with his knowledge of what was inflicted on her; between how a communist commander of one of the Gulags regarded the inmates’ hard labor in subzero temperatures, on starvation diets, and at the brink of death under the terrible conditions he imposed on them, and a high-ranking religious SS officer who knew about the exterminations and did what he could to save lives; between two women, one of whom accepted her society’s standards and felt shame when shamed by them, while the other defiantly refused to feel shame even though she was shamed by her society’s standards; between a man who valued social harmony more than justice and a woman who gave up her life for the sake of justice; and between an honorable soldier who killed himself when circumstances made it impossible for him to remain true to himself and a young man who was deliberately false to himself by rejecting the past that made him who he was.

    These comparisons reconstruct the predicaments faced by the protagonists, trace the reasons that have led them to arrive at the difficult life-changing answer each gave, and enable us to learn from them as we struggle with hard questions as they arise in our own lives. In discussing them, I rely on the resources of anthropology, history, literature, politics, and religion. They make concrete the predicaments of people who had to face hard questions and how they answered them, some more and others less reasonably. The practical, particular, personal, concrete, and comparative nature of these examples exemplifies an approach to ethics that is an alternative to the theoretical, general, impersonal, and abstract approach that characterizes the mainstream of contemporary ethical thought. I attempt to show that ethics can be done in a way that reasonably and concretely addresses the vital concerns of reflective people.

    Featured image credit: “concrete” by Valentin Lacoste. CC0 via Unsplash.

    The post How to address the enigmas of everyday life appeared first on OUPblog.

    in OUPblog - Psychology and Neuroscience on December 04, 2019 10:30 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Lying To Your Kids Could Make Them More Dishonest And Less Well-Adjusted As Adults


    By Emily Reynolds

    Telling white lies to children can be par for the course when you’re a parent: “I’ve got Santa on the phone and he says he’s not coming unless you go to bed now,” is particularly useful during the festive season, for example.

    It can seem like nothing: just another tool to improve your child’s behaviour. But don’t get too attached to the technique — telling too many white lies to your children may have more far-reaching consequences than you might expect, according to a new study published in the Journal of Experimental Child Psychology.

    To examine the impact of parental lying, Peipei Setoh from Nanyang Technological University, Singapore, and colleagues gave 379 Singaporean adults four online questionnaires. First, in the “parenting by lying” questionnaire, participants were asked to recall whether or not their parents had told them various lies during childhood. This 16-item test covers four categories of lies: lies about food, lies about leaving or staying (e.g. “if you don’t come with me now I’ll leave you here”), lies about misbehaviour and lies about spending money.

    Next, the participants filled in the “lying to parents” questionnaire, indicating how frequently they themselves now lie to their parents as adults. Three categories were examined: lies concerning “activities and actions” they had taken part in, such as the details of relationships or friendships, “prosocial lies”, in which they had lied to benefit others, and exaggerations about events.

    Finally, they took part in a longer questionnaire, which included questions on psychological and social dysfunctions such as problems with thought, attention, aggression, and rule-breaking. They also completed the Levenson self-report psychopathy test, which examines psychopathic traits such as selfishness and impulsivity.

    The results suggested that those whose parents had lied more were now more likely to lie to their own parents — by being lied to, in other words, it seemed they had started to believe that being dishonest was morally acceptable. Parental dishonesty may also have eroded trust, the team suggests: by being lied to, children stop trusting their parents and would therefore be less likely to feel obligated to tell them the truth. Participants who were lied to more frequently in childhood were also more likely to have higher levels of maladjustment as adults, particularly when it came to “externalising” problems like aggression.

    There are, however, questions about whether the causal inference is as straightforward as it seems. If parents are constantly lying to their children, for example, there may be other underlying relational issues contributing to problems in adolescence and adulthood. Misleading children might not help their development, but deeper problems may also be responsible for their difficulties with attention or behaviour. Participants were also being asked to recall childhood experiences, of which they may have limited memory; subsequent family rifts, deaths, or estrangements may also have affected their view.

    Nevertheless, next time you think about telling what you see as a harmless white lie to keep your child quiet or get them into bed, think again. It may save you some time — but, in the long run, it’s probably not worth it.

    Parenting by lying in childhood is associated with negative developmental outcomes in adulthood

    Emily Reynolds (@rey_z) is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on December 04, 2019 09:29 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Your Research Makes a Difference

     

    Last month, PLOS ONE and PLOS Biology showcased some of the most impactful research published from 2015–18 in two new collections. Looking at this body of work, we’re reminded of how science builds the foundation for our understanding of the world around us. Medical research influences public health and professional practice, earth sciences inform how we understand and react to the changing climate, and the list goes on.

    Each new discovery has a ripple effect on the future of research and on society itself. How we measure that effect, and all the factors that contribute to it, is understandably complicated. Knowing how best to tell the story of your impact can help ensure recognition for your work.

     

    Tracking digital footprints through article-level-metrics

    Citations help us understand and track how a particular research article has helped other researchers build on existing data and theories to continue investigation in the field. But citations take time, and citation habits differ among disciplines. Counting citations alone won’t always capture the way an article has affected new research, or policies and professional practice beyond the sphere of academia. To do that, we need to take a broader view and a more holistic approach to understanding research.

    With more and more research activity happening through online sources, it’s possible to track the conversation in different ways. In addition to citations, article-level-metrics help authors keep track of different channels of influence specific to a particular article. Rather than relying on a single number, authors can share a more comprehensive story of their impact that captures the nuance of how other audiences engage with research.

    In fields with high interdisciplinarity and broad public impact, social media mentions capture the conversation. Views and downloads, particularly in areas with a need for immediate information (e.g. disease outbreaks, climate change), also provide a sampling of a work’s broad reach.
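
    To make this concrete, here is a minimal sketch (in Python) of pulling one such signal programmatically: the citation tally exposed by the public Crossref REST API. This is an illustration rather than a PLOS tool; the DOI and contact address below are placeholders, and a real article-level-metrics view would aggregate many more sources (views, downloads, social media mentions) in the same spirit.

        # Minimal sketch: fetch one article-level signal (Crossref's citation
        # count) for a given DOI. Assumes the third-party `requests` package.
        import requests

        def crossref_citation_count(doi):
            """Return the number of citations Crossref has recorded for a DOI."""
            resp = requests.get(
                "https://api.crossref.org/works/" + doi,
                # Crossref etiquette: identify yourself in the User-Agent header
                headers={"User-Agent": "alm-sketch/0.1 (mailto:you@example.org)"},
                timeout=10,
            )
            resp.raise_for_status()
            # "is-referenced-by-count" is Crossref's citation tally for the work
            return resp.json()["message"]["is-referenced-by-count"]

        # Placeholder DOI; substitute the DOI of your own article.
        print(crossref_citation_count("10.1371/journal.pxxx.0000000"))

    Combining several such sources, rather than relying on a single number, is what lets authors tell the more comprehensive story described above.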

     

    Connecting to a broad audience: How you tell your story matters

    Open Access makes it possible for work to reach a global audience, immediately and without barriers or paywalls. That means your research has the potential to be read by everyone, not just the scientific community. If you want to reach a broader audience, engage more people, and boost the visibility of your work, how you tell your research story matters.

    Media coverage is one way for authors to boost the impact of their work; it also forms a key connection to readers, researchers and policymakers. The partnership between science and journalism is an important means of helping a broader audience understand and sift through the science that is happening every day, so that its teachings can begin to be used in everyday practice.

    If you have the opportunity to connect with media partners before or after your work publishes – do it! But you can also give more context to your work through your own channels. PLOS provides tips for promoting your own work through your institutional network, social media channels, and more here. Track the effect your work has through your article-level-metrics, or just take a moment to join a conversation about how research can inspire people. 

     

    The bottom line: your research makes a difference. When you engage in the conversation, you have the opportunity to add even more context and reach new audiences. If you need extra motivation, just take a look at what our authors and editors had to say about the featured work on EveryONE and Biologue.

     

    in The Official PLOS Blog on December 03, 2019 09:46 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    4 authors on how – and why – to share your lab hardware designs

    By publishing their designs for lab equipment in HardwareX, these young researchers hope to share cost savings – and one-of-a-kind devices

    in Elsevier Connect on December 03, 2019 04:02 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The Psychological Impacts Of Poverty, Digested


    By Emma Young

    For a “rich” country, by global standards, the UK has an awful lot of people who are not. Fourteen million people — one fifth of the population — live in poverty. Of these, four million are more than 50% below the poverty line, and 1.5 million are classed as destitute, unable to afford even basic life essentials.

    For children who grow up in poverty, there are impacts that go way beyond the fact of material shortages. “Children experience poverty as an environment that is damaging to their mental, physical, emotional and spiritual development,” notes UNICEF. Clearly, there’s a critical role for psychological research in this area, first in revealing just what poverty does to children and adults — but also in developing strategies to ameliorate those impacts.

    The psychological effects on children of growing up poor do make for grim reading. A 2009 study published in the Journal of Cognitive Neuroscience, of 9- and 10-year-olds who differed only in their socioeconomic status, found striking differences in activity in the prefrontal cortex, which is critical for complex cognition. On various tests, the PFC response of many of the poor children resembled that of some stroke victims. “Kids from lower socioeconomic levels show brain physiology patterns similar to someone who actually had had damage in the frontal lobe as an adult,” commented lead researcher Robert Knight, professor of psychology at the University of California, Berkeley.

    The kinds of deficits that the team observed could cause problems with self-regulation and behavioural difficulties (both of which have been documented among poorer children), as well as difficulties with reasoning. “This is a wake-up call,” Knight went on. “It’s not just that these kids are poor and more likely to have health problems, but they might actually not be getting full brain development from the stressful and relatively impoverished environment associated with lower socioeconomic status: fewer books, less reading, fewer games, fewer visits to museums.”

    Since then, plenty of other studies have found that poverty harms children’s brains. In 2014, experiments led by Michele Tine revealed clear deficits in both verbal and visuospatial memory among poor children. A year later, a paper published in JAMA Pediatrics documented “irregular brain development” in low income children, and tied these lags in the development of the frontal and temporal lobes to substantially lower scores on maths and reading tests. The development of the hippocampus, which is involved in memory, was particularly influenced by the stresses experienced by these children, the team found.

    These impacts can be long-lasting. A longitudinal study, published by Gary Evans in PNAS in 2016, found that adults who were poor as children showed memory deficits and experienced greater psychological distress. In 2019, meanwhile, a long-term study of nearly 4,000 families in Canada, led by Paul Hastings, reported that growing up in a poor urban neighbourhood is associated with a doubling in the risk of developing a psychosis-spectrum disorder by middle adulthood.

    These studies paint a very bleak picture of the effects of poverty. But not all poor children are affected in the same way: not every impoverished child in the prefrontal cortex study showed deficits, for example. This suggests that there are also protective factors. Further research in this area suggests that overall stress levels, and the behaviour of the people close to the child, can make a big difference.

    Something as simple as planting more trees in schools in disadvantaged areas might help, according to research by a team at the University of Illinois, published in 2018. Ming Kuo and her colleagues quantified the level of tree and grass cover in the schoolyards of 318 elementary schools (in which 87 per cent of kids overall fell into a low family income category) and found a correlation with scores in both maths and reading: the greater the number of trees, the better the results. Based partly on other work finding a link between the abundance of trees and academic performance outside a low-income setting, Kuo thinks there is a meaningful link between the two. “It’s not a surprise to anyone that if you don’t provide air conditioning or heating in a school then maybe the kids aren’t going to do as well. But this is the first time we’ve begun to suspect that the lack of landscaping, such as trees, may help explain, in part, their poorer test scores,” she says.

    A British study published the same year supported these conclusions. This study, of 4,758 11-year-olds living in urban areas of England, found that children who lived in greener neighbourhoods performed better on tests of spatial working memory (an effect that held for both deprived and non-deprived neighbourhoods). “Our findings suggest a positive role of greenspace in cognitive functioning,” commented researcher Eirini Flouri at University College London. What might this role be? Perhaps greenspace is restful for the brain, and restores the ability to concentrate.

    Interventions that focus on the families of kids growing up in poverty should also help. The team that observed the PFC deficits thinks that, in theory, they could be prevented or eliminated. Earlier work has shown that children in poor families hear about 30 million fewer words by the time they are four than children from middle-class families. Just talking more to kids can boost prefrontal cortex performance, the team notes; so, they say, changing the developmental outcomes they observed might involve something as simple as emphasising to all parents the importance of talking to their kids.

    Children raised in low socioeconomic status families also tend to go on to have relatively high rates of chronic illness in adulthood, but again, this isn’t inevitable. A nurturing, attentive and emotionally-supportive mother can buffer the impacts of poverty on physical health, finds a 2011 study led by G.E. Miller and published in Psychological Science.

    There’s other work, from the University of Liverpool’s Sophie Wickham in 2014, suggesting that a person’s perceptions of their levels of stress, trust and social support mediate the impact of poverty on rates of depression and paranoia. This highlights a potential role for the broader community in mitigating the effects of poverty on individuals within that community. The research, conducted in two areas of Birmingham, suggested that community resilience to hardships, like joblessness and low income, can be enhanced, and that this primarily relies on relationships “not just between members of the community, but also between organisations, specifically between the voluntary sector, the local economy and the public sector.”

    Of course, the self-evident way to tackle the negative psychological impacts of poverty is to tackle poverty itself.

    Relaxing financial strains can make a real difference, according to research published in PNAS last year. This work on low income people who were categorised as being “chronically indebted” found that a one-off debt relief programme (funded by a charity) eased the participants’ anxiety, and improved their cognitive functioning, allowing them to make better financial decisions three months later. Thinking and worrying about un-payable debts is so mentally demanding that it contributes to the poverty trap, the researchers argued. “Our study shows that because debt impairs psychological functioning and decision-making, it would be extremely challenging for even the motivated and talented to escape poverty,” commented Ong Qiyan at the National University of Singapore. “Instead, the poor must either have exceptional qualities or be exceptionally lucky to get out of poverty. It is hard to be poor, harder than we thought.”

    People who are not poor and have debts simply do not experience the same mental drain, the team comments. “The findings in this study opens a pragmatic case for designing good debt relief programmes for low income households,” Ong argues. Even if debt can’t be written off, streamlining people’s debts, so that mentally they are easier to manage, could help, the team writes.

    In 2015, the UK government committed to achieving, by 2030, the UN’s Sustainable Development Goals (SDGs), which include, at number one, “no poverty”, with “zero hunger” at number two. However, in 2019, the House of Commons’ Environmental Audit Committee issued a report finding that food insecurity is “significant and growing” in the UK, with levels among the worst in Europe, especially for children. One in ten households in England, Wales and Northern Ireland have “low or very low food security”, with a further 10% classed as “marginally food secure”. Food bank use is up: between 1 April 2018 and 31 March 2019, the Trussell Trust distributed 1.6 million emergency three-day food supplies to people across the UK, a 19% increase on the previous year, and more than half a million of these went to children.

    These are desperate statistics. Providing more greenspace around schools in deprived areas and providing support programmes for low-SES mothers, for example, may work to ameliorate the effects of poverty on children. But for a child whose family can’t afford to feed them, it’s hard not to wonder what difference such measures could really make. One third of the way into the UK’s 15-year timeframe for meeting the UN SDGs of No Poverty and Zero Hunger, we have an extraordinarily long way to go.

    Emma Young (@EmmaELYoung) is a staff writer at BPS Research Digest

    The British Psychological Society’s 2020 priority is “From poverty to flourishing”.

    in The British Psychological Society - Research Digest on December 03, 2019 09:41 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Get to Know an Academic Editor: Dr. Annie Angers

    Note: we are attending the ASCB/EMBO annual meeting in Washington, DC from Dec 7-11. Stop by booth #708 and say hi to our very own Philip Mills. On December 9th, Executive Editor Veronique Kiermer will join ASAPbio’s Jessica Polka at a Review Commons event. You can register here.

    Dr. Annie Angers is in the Département de sciences biologiques at the Université de Montréal.

    Q: Can you tell us a little about yourself?

    A: I am a professor in the department of biological sciences at the University of Montréal, Canada. I teach molecular and cell biology at all levels. My lab focuses on receptor internalization and inter-organelle trafficking in different cell types.

    Q: How many years have you been an editor on PLOS ONE?

    A: About two years.

    Q: Why is PLOS ONE important to you and the community?

    A: The publishing criteria. Publishing good science and well-designed experiments in a wide variety of fields makes PLOS ONE a very attractive journal.

    Q: What is your area of study and why is it important?

    A: I study mostly cellular physiology, focusing on inter-organelle exchanges. I think this area is important because it is opening up all kinds of new findings and challenging some of our current views of the cell. We are also expanding our work to non-model organisms, which reveal that nature’s diversity is present not only at the species level, but also at the cellular and molecular levels.

    Q: What first drew you into the field?

    A: I was first attracted to the field by the findings that internalized receptors continued to signal and had access to a whole new range of effector molecules once at the endosomes. Since then, our molecular toolbox has expanded tremendously, and there seems to be no limit to the questions we can expect to answer.

    Q: Are there any trends in your field right now?

    A: To me, the most notable trend is revisiting very fundamental questions with molecular and imaging tools that we could only dream of 15-20 years ago.

    Q: Why do you believe in Open Science?

    A: Open Science is only common sense in the academic world of publicly funded research. From day one as a graduate student you learn that no matter how good your ideas, they are useless if you don’t share them with the world. The best science is of no use if other researchers don’t have access to it. Therefore, putting your research behind a paywall makes no sense at all.

     

    in The Official PLOS Blog on December 02, 2019 05:26 PM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Our Ability To Recognise Dogs’ Emotions Is Shaped By Our Cultural Upbringing

    By Emily Reynolds

    As anyone who’s ever had to scold their dog for stealing food off a plate or jumping onto that oh-so-tempting forbidden sofa can attest, dogs are pretty good at understanding what we’re saying to them — at least when it suits them.

    Research has also shown that dogs are able to understand some aspects of human communication, perhaps because throughout history we’ve used dogs for their ability to respond to our commands. Words, hand signs and gestures, tone of voice and facial expression — it seems that dogs have the ability to understand them all. But what about human understanding of dogs?

    Federica Amici from the Max Planck Institute for Evolutionary Anthropology and colleagues ask just that in their latest piece of research, published in Scientific Reports.  First, the team recruited participants with a variety of experiences with dogs and who grew up in a variety of cultures, each with their own cultural attitude towards the animals. Parts of Europe, for example, have generally dog-positive cultures: people consider their dogs to be part of the family, and they live inside the house. In Muslim-majority countries, on the other hand, dogs live outdoors and are not considered to be family members.

    The researchers therefore recruited 88 adult and 77 child participants from four demographics: non-Muslim European dog-owners; non-Muslim European non-owners; Muslim non-owners from Muslim-majority countries who had lived in Europe for at least three years; and Muslim non-owners living in a Muslim country. Regardless of demographic, there was a range of attitudes towards dogs amongst participants.

    Participants were then shown facial photographs of 20 dogs, 20 chimpanzees and 20 humans, all displaying various different emotions — happiness, sadness, anger and fear — or a neutral expression, and rated how much each picture represented each emotion. They were also asked about the context in which each photo was taken — did they think a dog was playing with a trusted friend, for example, or was it about to attack someone?

    Results suggested that although some ability to recognise dog emotions exists from early on in life, it is largely a skill we acquire through experience. The children’s ability to recognise emotions was similar across the board: experience with dogs, or growing up in a culture receptive to them, did not have much of an impact on how well children performed in the task.

    But in adults, cultural experience played a large role. Whether or not they owned dogs, participants who had grown up in a European, dog-positive culture were far better at recognising dog emotions than those who had grown up in a Muslim country (even if these participants had later moved to Europe). This pattern held only for dogs: all groups performed equally well when it came to assessing chimpanzee emotions. As you might expect, both adult and child participants were able to recognise human emotions better than dog emotions.

    “These results are noteworthy because they suggest that it is not necessarily direct experience with dogs that affects humans’ ability to recognize their emotions, but rather the cultural milieu in which humans develop,” Amici says.

    Interestingly, all groups performed better if asked to recognise the context of images rather than asked to name the emotions outright, suggesting that our ability to interpret emotions may rely on the cues we get from any given situation.

    The focus of the research was somewhat narrow. Only a select few cultural groups were involved in the study, and it’s unclear whether the findings can be generalised to European and Muslim groups more broadly, as cultural attitudes can be complex and nuanced. Pictures of dogs only depicted those with a “German shepherd-like face”, meaning results may differ with other breeds and face types, and there are many people who have different experiences with dogs — non-owners who are experts, for example.

    But studying how experiential factors influence our recognition of animal emotions could be one strand in a further examination of cultural differences in emotion recognition. And it might just help you understand your own pet better, too.

    The ability to recognize dog emotions depends on the cultural milieu in which we grow up

    Emily Reynolds (@rey_z) is a staff writer at BPS Research Digest

    in The British Psychological Society - Research Digest on December 02, 2019 10:46 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    The Lancet, UNSW and Khachigian's cancer cure

    A dishonest cancer researcher. A dud cancer drug based on rigged lab data. A clinical trial in The Lancet. A greedy university which finds no misconduct. And a medical journal which tramples over patients.

    in For Better Science on December 02, 2019 06:59 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Pheromone Friday

    Pheromones, emitted chemicals that elicit a social response in members of the same species, have been most widely studied in insects as a mode of communication. In the insect world, pheromones can signal alarm, mark trails, control worker bee behavior, and elicit sexual behavior.

    Sex pheromones are the chemicals that come to mind in popular lore. Do human beings secrete substances that are likely to attract potential mates? Unscrupulous players in the fragrance industry would like you to believe that's the case. Unable to attract women (or men)? There's a difference between marketing an intoxicating and sensual fragrance that's pleasing to the nose and selling snake oil such as the “SexyLife” pheromone products.

    Amazon even cautions prospective customers about SexyLife.

    {BTW, humans lack a functional vomeronasal organ, the part of the accessory olfactory system that detects pheromones / chemosignals / non-volatile molecules (Petrulis, 2013).}


    Don't we already know that human pheromones are a crock?

    It depends on how you define pheromone, some would say.1 “In mammals [rodents], few definitive cases have been identified in which single pheromone compounds evoke robust sexual behaviours, which might reflect an important contribution of signature mixtures in sexual communication” (Gomez-Diaz & Benton 2013, The joy of sex pheromones). In rodents, reproductive responses to “odor blends” or chemosignals are heavily modulated by experience, as opposed to the instinctive and fixed behaviors elicited by pheromones in insects. The evidence supporting the existence of mammalian pheromones is so weak that Richard Doty has called it The Great Pheromone Myth.

    If rats don't have “pheromones” per se, why look for them in humans? Tristram Wyatt, who believes that human pheromones probably exist, wrote a paper called The search for human pheromones: the lost decades. He criticized the literature on four androgen-related steroids (androstenone, androstenol, androstadienone and estratetraenol), saying it suffers from publication bias, small sample sizes, lack of replication, and commercial conflicts of interest. There is no bioassay-based evidence that these molecules are human pheromones, yet “the attraction of studies on androstadienone (AND) and/or estratetraenol (EST) seems unstoppable” (Wyatt, 2015).

    {Curiously, the SexyLife ad accurately lists the putative male pheromones, although their depicted functions are pure fantasy.}

    Unstoppable it is. Supporters of human pheromones have recently published positive results on male sexual cognition, male dominance perception, cross-cultural chemosignaling of emotions, and sex differences in the main olfactory system.2


    Olfactory Attraction

    On the other hand, a null finding from 2017 drew a lot of attention from popular media outlets and Science magazine, where the senior author stated: “I’ve convinced myself that AND and EST are not worth pursuing.” In that study, AND & EST had no effect on the participants' attractiveness ratings for photographs of opposite-sex faces (Hare et al., 2017).

    The evolutionary basis of Smell Dating was given a cold shower by studies showing that the fresh (and odorless) armpit sweat of men and women, when incubated in vitro with bacteria that produce body odor, was rated identically on pleasantness and intensity (reviewed in Doty, 2014). Meanwhile, the day-old smelly armpit sweat of men was rated as equally unpleasant by men and women.3 Likewise, pleasantness and intensity ratings for female armpit sweat did not differ between men and women. This doesn't bode well for heterosexual dating...

    Odors and fragrances are an important part of attraction, of course, but don't call them pheromones.


    Footnotes

    1 There is an accepted definition for "pheromone".

    2 Since humans don't have an accessory olfactory system with its fun vomeronasal organ, the main olfactory system would have to do the pheromone-detecting work.

    3 This could be due to larger apocrine glands, hairy armpits, and more carnivorous diets in men (Doty, 2014).


    Further Reading

    Scientific post in favor of human pheromones:
    “Whether one chooses to believe in the existence of human pheromones or not, steroids clearly serve an essential olfactory signaling function that impacts broadly ranging aspects of the human condition from gender perception to social behavior to dietary choices.”

    PET studies on AND, EST, and sexual orientation:

    References

    Doty RL. (2014). Human Pheromones: Do They Exist? In: Mucignat-Caretta C, editor. Neurobiology of Chemical Communication. Boca Raton (FL): CRC Press/Taylor & Francis; Chapter 19.

    Gomez-Diaz C, Benton R. (2013). The joy of sex pheromones. EMBO Rep. 14(10): 874-83.

    Hare RM, Schlatter S, Rhodes G, Simmons LW. (2017). Putative sex-specific human pheromones do not affect gender perception, attractiveness ratings or unfaithfulness judgements of opposite sex faces. R Soc Open Sci. 4(3): 160831.

    Petrulis A. (2013). Chemosignals, hormones and mammalian reproduction. Horm Behav. 63(5): 723-41.

    Wyatt TD. (2015). The search for human pheromones: the lost decades and the necessity of returning to first principles. Proc Biol Sci. 282(1804): 20142994.


    in The Neurocritic on November 30, 2019 04:05 AM.

  • Wallabag.it! - Save to Instapaper - Save to Pocket -

    Request For Comments on “Data Repository Selection: Criteria That Matter”

    This post is a request for comments on the preprint Data Repository Selection: Criteria That Matter via this online form on behalf of the authors of the preprint (full list available there).

    Background

    At PLOS we have long advocated for and developed data policies to ensure that datasets, as well as other digital products associated with articles, are deposited and made accessible via appropriate repositories and platforms. Research questions, study design, and data all contribute to the full and accurate story of science. These policies are also, since 2016, informed by the Findable, Accessible, Interoperable and Reusable (FAIR) Data Principles.

    The organizations FAIRsharing and DataCite have joined forces with a group of publisher representatives (authors of this work, including PLOS) to propose a set of criteria that are important for the identification and selection of data repositories, which can be recommended to researchers when they are preparing to publish the data underlying their findings. 

    This work aims to 

    • reduce complexity for researchers when preparing submissions to journals 
    • increase efficiency for data repositories that currently have to work with multiple publishers
    • simplify the process of recommending data repositories for publishers

    Overall, this work aims to make the implementation of research data policies more efficient and consistent, and to improve approaches to data sharing by promoting the use of community-approved, reliable data repositories.

    Researchers who generate and reuse data are key stakeholders in the research data lifecycle. However, the first target audience for this work is other journals and publishers, repository developers and maintainers, certification and evaluation initiatives, and other policymakers.  

    We invite you to read the preprint, which describes the work, its motivation, and its relation to other initiatives, and to provide feedback via this form.

    in The Official PLOS Blog on November 29, 2019 04:45 PM.