New insights into the origins of mutations in cancer

Researchers at the European Bioinformatics Institute (EMBL-EBI), the University of Dundee and the Wellcome Sanger Institute have used human and worm data to explore the mutational causes of cancer. Their study, published today in Genome Research, also shows that results from controlled experiments on a model organism — the nematode worm C. elegans — are relevant to humans, helping researchers refine what they know about cancer.

Enigmatic DNA mutation and repair

Cancer is caused by DNA mutations, which can be triggered by a range of factors, including UV radiation, certain chemicals and smoking, as well as errors occurring naturally during cell division. A cell recognises most of these mutations and corrects them through multiple repair mechanisms. However, DNA repair is not perfect: it can leave certain mutations unrepaired or repair them incorrectly, leading to permanent changes in the DNA. Understanding the footprints of these mutational processes is an important first step in identifying the causes of cancer and potential avenues for new treatments.

“The DNA mutations we see in cancer cells were caused by a yin and yang of DNA damage and repair,” explains Moritz Gerstung, Research Group Leader at EMBL-EBI. “When we study a patient’s cancer genome, we’re looking at the final outcome of multiple mutational processes that often go on for decades before the disease manifests itself. The reconstruction of these processes and their contributions to cancer development is a bit like the forensic analysis of a plane crash site, trying to piece together what’s happened. Unfortunately, there’s no black box to help us.

“Controlled experiments in model organisms can be used to mimic some of the processes thought to operate on cancer genomes and to establish their exact origins.”

What worms can tell us

Previous research has shown that one of the first DNA repair pathways associated with an increased risk of cancer is DNA mismatch repair (MMR). The current study uses C. elegans as a model system for studying MMR in more detail.

“Dr Bettina Meier in my team initiated this project by assessing the kinds of mutations that arise when C. elegans is defective for one specific DNA repair pathway,” says Professor Anton Gartner, Principal Investigator in the Centre for Gene Regulation and Expression at Dundee. “As it only takes three days to propagate these worms from one generation to the next, the process of studying how DNA is passed on is greatly expedited. Worms deficient in DNA mismatch repair were propagated for many generations, and this allowed us to deduce a distinct mutational pattern. The big question was whether the same type of mutagenesis also occurred in human cancer cells.”

To address this question, EMBL-EBI PhD student Nadia Volkova compared the C. elegans results with genetic data from 500 human cancer genomes.

“We found a resemblance between the most common signature associated with mutations in MMR genes in humans and the patterns found in nematode worms,” explains Volkova. “This suggests that the same mutational process operates in nematodes and humans. Our approach allows us to find the exact profile of MMR deficiency and to understand more about what happens when DNA repair goes wrong.”
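
At its core, that comparison is a matching of mutation spectra. As a minimal sketch, assuming mutation counts have already been tallied into the standard 96 trinucleotide-context channels (this is an illustration with made-up arrays, not the study’s actual pipeline), the resemblance between a worm-derived pattern and a human MMR-associated signature can be scored with cosine similarity:

```python
import numpy as np

def cosine_similarity(spectrum_a, spectrum_b) -> float:
    """Cosine similarity between two mutation spectra (e.g. 96 context channels)."""
    a = np.asarray(spectrum_a, dtype=float)
    b = np.asarray(spectrum_b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical inputs: mutation counts per trinucleotide context for an
# MMR-deficient worm line and an MMR-associated signature from human tumours.
rng = np.random.default_rng(seed=0)
worm_spectrum = rng.integers(1, 50, size=96)
human_mmr_signature = rng.integers(1, 50, size=96)

print(f"cosine similarity: {cosine_similarity(worm_spectrum, human_mmr_signature):.2f}")
```

A value close to 1 would indicate that the two spectra have essentially the same shape.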

These findings could lead to a better understanding of the causes of cancer and potentially help to identify the most appropriate treatment.

[“Source-sciencedaily”]

Legislators Are Missing the Point on Facebook

HIGHLIGHTS

  • The real issue: How our data get used
  • Facebook has been quite open and obvious about their skeevy practices
  • America needs a smarter conversation about data usage

I’m getting increasingly baffled and disappointed by the scandal-cum-congressional-ragefest surrounding Facebook. Instead of piling on Mark Zuckerberg or worrying about who has our personal data, legislators should focus on the real issue: How our data get used.

Let’s start with some ground truths that seem to be getting lost:

– Cambridge Analytica, the company that hoovered up a bunch of data on Facebook users, isn’t actually much of a threat. Yes, it’s super sleazy, but it mostly sucked at manipulating voters.

– Lots of other companies – maybe hundreds! – and “malicious actors” also collect our data. They’re much more likely to be selling our personal information to fraudsters.

– We should not expect Zuckerberg to follow through on any promises. He’s tried to make nice before to little actual effect. He has a lot of conflicts and he’s kind of a naive robot.

– Even if Zuckerberg were a saint and didn’t care a whit about profit, chances are social media is still just plain bad for democracy.

Politicians don’t want to admit that they don’t understand technology well enough to come up with reasonable regulations. Now that democracy itself might be at stake, they need someone to blame. Enter Zuckerberg, the perfect punching bag. Problem is, he likely did nothing illegal, and Facebook has been relatively open and obvious about their skeevy business practices. For the most part, nobody really cared until now. (If that sounds cynical, I’ll add: Democrats didn’t care until it looked like Republican campaigns were catching up to or even surpassing them with big data techniques.)

What America really needs is a smarter conversation about data usage. It starts with a recognition: Our data are already out there. Even if we haven’t spilled our own personal information, someone has. We’re all exposed. Companies have the data and techniques they need to predict all sorts of things about us: our voting behaviour, our consumer behaviour, our health, our financial futures. That’s a lot of power being wielded by people who shouldn’t be trusted.

If politicians want to create rules, they should start by narrowly addressing the worst possible uses for our personal information – the ways it can be used to deny people job opportunities, limit access to health insurance, set interest rates on loans and decide who gets out of jail. Essentially any bureaucratic decision can now be made by algorithm, and those algorithms need interrogating way more than Zuckerberg does.

To that end, I propose a Data Bill of Rights. It should have two components: The first would specify how much control we may exert over how our individual information is used for important decisions, and the second would introduce federally enforced rules on how algorithms should be monitored more generally.

The individual rights could be loosely based on the Fair Credit Reporting Act, which allows us to access the data employed to generate our credit scores. Most scoring algorithms work in a similar way, so this would be a reasonable model. As regards aggregate data, we should have the right to know what information algorithms are using to make decisions about us. We should be able to correct the record if it’s wrong, and to appeal scores if we think they’re unfair. We should be entitled to know how the algorithms work: How, for example, will my score change if I miss an electricity bill? This is a bit more than FCRA now provides.
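
To make the “how will my score change” question concrete, here is a deliberately toy scoring model, invented for illustration and not based on the FCRA or any real bureau’s formula; the point is simply that an accountable algorithm should be able to answer this kind of what-if query:

```python
# Hypothetical linear scoring model; features and weights are invented for
# illustration and do not describe any real credit-scoring system.
WEIGHTS = {
    "on_time_payment_rate": 300.0,   # fraction of bills paid on time
    "credit_utilisation": -150.0,    # fraction of available credit in use
    "missed_utility_bills": -25.0,   # count of recently missed utility bills
}
BASE_SCORE = 550.0

def score(features: dict) -> float:
    """Return the toy score for one person's feature values."""
    return BASE_SCORE + sum(WEIGHTS[name] * value for name, value in features.items())

current = {"on_time_payment_rate": 0.95, "credit_utilisation": 0.40, "missed_utility_bills": 0}
what_if = dict(current, missed_utility_bills=1)  # the "missed electricity bill" scenario

print(f"current score:         {score(current):.0f}")
print(f"after one missed bill: {score(what_if):.0f} (change: {score(what_if) - score(current):+.0f})")
```

In a system with the transparency rights described above, that kind of sensitivity would be disclosed to the person being scored rather than reverse-engineered.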

Further, Congress should create a new regulator – along the lines of the Food and Drug Administration – to ensure that every important, large-scale algorithm can pass three basic tests (Disclosure: I have a company that offers such algorithm-auditing services.):

– It’s at least as good as the human process it replaces (this will force companies to admit how they define “success” for an algorithm, which far too often simply translates into profit);

– It doesn’t disproportionately fail when dealing with protected classes (as facial recognition software is known to do); a minimal disparity check is sketched after this list;

– It doesn’t cause crazy negative externalities, such as destroying people’s trust in facts or sense of self-worth. Companies wielding algorithms that could have such long-term negative effects would be monitored by third parties who aren’t beholden to shareholders.
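
As a minimal sketch of the second test, assume the auditor has a log of predictions, actual outcomes and a protected attribute for each case; the group labels, the rows and the flagging threshold below are illustrative assumptions, not a legal standard:

```python
from collections import defaultdict

# Hypothetical audit records: (protected_group, predicted_label, true_label).
# A real audit would use the algorithm's actual logs; these rows are invented.
records = [
    ("group_a", 1, 1), ("group_a", 0, 1), ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 0, 1), ("group_b", 0, 1), ("group_b", 1, 1), ("group_b", 0, 0),
]

errors = defaultdict(int)
totals = defaultdict(int)
for group, predicted, actual in records:
    totals[group] += 1
    errors[group] += int(predicted != actual)

error_rates = {group: errors[group] / totals[group] for group in totals}
print("error rate by group:", error_rates)

# Flag the algorithm if one group's error rate is far above another's.
# The 1.25 ratio is an arbitrary illustrative threshold, not a regulatory rule.
worst, best = max(error_rates.values()), min(error_rates.values())
if best > 0 and worst / best > 1.25:
    print("disproportionate failure across groups: further audit required")
```

A regulator would of course need a far richer battery of checks, but even a comparison this simple makes the second test enforceable rather than rhetorical.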

I’m no policy wonk, and I recognise that it’s not easy to grasp the magnitude and complexity of the mess we’re in. A few simple rules, though, could go a long way toward limiting the damage.

[“Source-gadgets.ndtv”]

Moto G6, Moto G6 Play, Moto G6 Plus to launch today: Here’s how to watch the live stream at 7PM IST

Lenovo-owned Motorola is all set to launch the Moto G6-series smartphones later today. The event is set to take place in Sao Paulo, Brazil, and it will kick off at 10:30AM (Brazil local time), which is 7:00PM in India. Motorola will also live-stream the event on its Facebook page.

The Moto G6-series will include the Moto G6, the Moto G6 Play and the Moto G6 Plus variants. The Moto G6 Play is expected to be an entry-level variant with a single camera at the back, whereas the Moto G6 and Moto G6 Plus are expected to feature dual cameras at the back. All three smartphones are expected to come with full-screen displays and an 18:9 aspect ratio.

A home button with an embedded fingerprint sensor is also expected below the display. In the hardware department, the Moto G6 Play is expected to be powered by a Snapdragon 430 octa-core SoC and feature a massive 4,000mAh battery.

The Moto G6 is expected to be powered by a Snapdragon 450 octa-core SoC paired with 3GB / 4GB of RAM and 32GB / 64GB of onboard storage. As for the dual cameras, a 12-megapixel primary sensor is expected alongside a 5-megapixel secondary sensor for adding DSLR-like bokeh effects to photos. A 16-megapixel front camera is also expected.

Lastly, the Moto G6 Plus will likely be powered by a Snapdragon 630 or Snapdragon 660 octa-core SoC, with 4GB of RAM and 64GB of onboard storage. It is expected to sport the same dual cameras at the back as the Moto G6. All three smartphones in the Moto G6-series are expected to run Android 8.1 Oreo out of the box.

In terms of pricing, the Moto G6 Play is expected to come in at around $200 (roughly Rs 13,000) and the Moto G6 at around $250 (roughly Rs 16,300). There is no word on the pricing of the Moto G6 Plus, but with the launch event less than an hour away, we will know more when Motorola officially makes the announcement.
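
The rupee figures above are straight conversions at an exchange rate of roughly Rs 65 to the US dollar, which is the rate implied by the article’s own numbers rather than an official quote:

```python
# Assumed exchange rate implied by the article's figures; not a live quote.
INR_PER_USD = 65.0

expected_prices_usd = {"Moto G6 Play": 200, "Moto G6": 250}

for model, price_usd in expected_prices_usd.items():
    print(f"{model}: ${price_usd} is about Rs {price_usd * INR_PER_USD:,.0f}")
```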

[“Source-bgr”]

Watch The Launch: NASA’s TESS Blasts Off From Cape

TESS—NASA’s newest planet-hunter—is now in space.

Sitting atop a Falcon 9 rocket, the robotic probe launched from Florida’s Cape Canaveral Air Force Station at 6:51 PM ET.

So far, there’s not a hitch in sight.

Says Sara Seager, MIT astrophysicist and the mission’s deputy director of science: “This is a wonderful celebration.”

And only the beginning.

TESS—$337 million and about the size of a stacked washer-dryer—will see “almost the entire sky,” says NASA.

And discover more worlds than ever before.

Artist impression of NASA’s TESS spacecraft. (Credit: MIT)

During the two-year mission, NASA expects TESS to find perhaps 20,000 exoplanets.

Or more. The Kepler Space Telescope looked at less than one percent of the sky—and detected nearly 5,400 planets (with about 2,700 now confirmed).

But most Kepler planets “are too distant and too dim to do any follow-up observations,” says Jeff Volosin of NASA’s Goddard Space Flight Center.

Instead, TESS will point its four cameras at 200,000 of the brightest, closest stars—30 to 100 times brighter than Kepler’s targets, and “only dozens to hundreds of light-years away,” says Volosin.

In space, that’s close—even though a single light-year equals almost six trillion miles.

Among the thousands of discoveries, NASA hopes to find hundreds of worlds reasonably near the size of Earth.

“Bigger than Earth but smaller than Neptune,” says Seager. “These planets are still a big mystery. Are they giant rocky planets? Or water worlds?”

Artist impression of a super-Earth. (Credit: ESO / M. Kornmesser)

Don’t expect discoveries within days. Once TESS is in space, scientists have to check out the probe. That takes two months.

Says Seager: “You have to wake up one part at a time to make sure everything works, and works together.” Detection announcements will follow, probably in a few more months.

To find a new world, TESS will look for “transits”—eclipse-like events, when a planet passes between its star and the spacecraft.

As that happens, the planet blocks a bit of the star’s light; that dip in the light, minuscule but measurable, tells scientists that something might be there.
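
The size of that dip is what makes the method quantitative: to first order, the fractional drop in brightness equals the square of the planet-to-star radius ratio. A minimal, idealised sketch (ignoring limb darkening and measurement noise) for an Earth-sized planet crossing a Sun-like star:

```python
# Idealised transit depth: the fraction of starlight blocked by the planet.
# Radii in kilometres; approximate values for Earth and the Sun.
R_EARTH_KM = 6_371.0
R_SUN_KM = 696_000.0

def transit_depth(planet_radius_km: float, star_radius_km: float) -> float:
    """Fractional dip in brightness = (R_planet / R_star) ** 2."""
    return (planet_radius_km / star_radius_km) ** 2

depth = transit_depth(R_EARTH_KM, R_SUN_KM)
print(f"Earth-sized planet, Sun-like star: {depth:.2e} ({depth * 1e6:.0f} parts per million)")
```

That works out to a dip of under 0.01 percent, which is why it is described above as minuscule but measurable.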

But TESS can’t tell if the planets are life-friendly. Discerning those details will be left for future space probes.

That includes NASA’s James Webb Space Telescope, launching in 2020—and ARIEL, a new mission from the European Space Agency, slated for a 2028 liftoff.

TESS, says Giovanna Tinetti, ARIEL’s principal investigator, “will clearly provide most of our exciting targets.”

Webb and ARIEL will analyze the atmospheres of TESS planets, searching for biosignatures—gases that indicate the possibility of life, like oxygen.

“Within the next decade,” says Volosin, “we hope we can identify the potential for life to exist outside our solar system.”

Artist impression of TESS in space. (Credit: NASA’s Goddard Space Flight Center / Chris Meaney)

But as NASA looks ahead, launch day is a time to look back.

Just two decades ago, many astronomers thought exoplanets were nearly nonexistent. Our solar system, with eight major worlds, was believed a quirk.

“Exoplanets were just considered silly,” says Seager. “Twenty years ago, it was insane to search for exoplanets.

“But the line between what’s considered mainstream and what’s considered crazy is constantly shifting.”

And now, TESS has launched. “And now,” says Seager, “it’s so mainstream.”

[“Source-forbes”]