

posted by martyb on Saturday February 13 2021, @04:54PM   Printer-friendly

Parsing C++ Is Literally Undecidable (2013):

Arthur T Knackerbracket has found the following story:

Many programmers are aware that C++ templates are Turing-complete, and this was proved in the 2003 paper C++ Templates are Turing Complete.

However, there is an even stronger result that many people are not aware of. The C++ FQA has a section showing that parsing C++ is undecidable, but many people have misinterpreted the full implications of this (understandable, since the FQA is discussing several issues over the course of its questions and does not make explicit the undecidability proof).

Some people misinterpret this statement to mean only that fully compiling a C++ program is undecidable, or that deciding whether a program is valid is undecidable. This line of thinking presumes that constructing a parse tree is decidable, and that only later stages of the compiler, such as template instantiation, are undecidable.

For example, see this (incorrect, but top-voted) Stack Overflow answer to the question What do people mean when they say C++ has “undecidable grammar”? This answer errs when it says: “Note this has nothing to do with the ambiguity of the C++ grammar.”

In fact, simply producing a parse tree for a C++ program is undecidable, because producing a parse tree can require arbitrary template instantiation. I will demonstrate this with a short program, which is a simplification/adaptation of what is in the FQA link above.

The parse tree for this program depends on whether TuringMachine::output is SomeType or not. If it is SomeType then ::name is an integer and the parse tree for the program is multiplying two integers and throwing away the result. If it is not SomeType, then ::name is a typedef for int and the parse tree is declaring a pointer-to-int named x. These two are completely different parse trees, and the difference between them cannot be delayed to further stages of the compiler.

The parse tree itself depends on arbitrary template instantiation, and therefore the parsing step is undecidable.

In practice, compilers limit template instantiation depth, so this is more of a theoretical problem than a practical one. But it is still a deep and significant result if you are ever planning on writing a C++ parser.



Original Submission

posted by martyb on Tuesday December 29 2020, @09:07PM   Printer-friendly

Entire Boston Dynamics robot line-up dances in the new year:

Boston Dynamics is sending off 2020 with its most impressive robot video to date – showing off its entire range dancing to the classic song “Do You Love Me?”. The fun video offers the first glimpse at two Atlas robots working together while also highlighting just how quickly this technology is developing.

Back in 2018 Boston Dynamics released a cute video of its dog-like Spot robot dancing to “Uptown Funk”. The playful video was a fun little demonstration of Spot’s broad range of movements, exciting at the time but very simplistic looking back from the vantage of today. Now the company has stepped things up, delivering a long choreographed dance video featuring not only Spot but two Atlas robots and a special appearance from Handle, a wheeled model.

Released as a kind of New Year’s gift from the company, the video is the first look at two Atlas humanoid robots working together. Atlas, still technically a prototype robot, has demonstrated a stunningly rapid evolution over the past decade from barely being able to walk in 2013, to being allowed to roam tetherless in 2015, completing a spectacular parkour routine just three years later, and finally getting acrobatic last year.


Original Submission

posted by requerdanos on Wednesday December 23 2020, @12:12AM   Printer-friendly
from the ghost-in-the-machine dept.

Exploring the potential of near-sensor and in-sensor computing systems:

As the number of devices connected to the internet continues to increase, so does the amount of redundant data transfer between different sensory terminals and computing units. Computing approaches that intervene in the vicinity of or inside sensory networks could help to process this growing amount of data more efficiently, decreasing power consumption and potentially reducing the transfer of redundant data between sensing and processing units.

Researchers at Hong Kong Polytechnic University have recently carried out a study outlining the concept of near-sensor and in-sensor computing. These are two computing approaches that enable the partial transfer of computation tasks to sensory terminals, which could reduce power consumption and increase the performance of algorithms.

"The number of sensory nodes on the Internet of Things continues to increase rapidly," Yang Chai, one of the researchers who carried out the study, told TechXplore. "By 2032, the number of sensors will be up to 45 trillion, and the generated information from sensory nodes is equivalent to 10²⁰ bit/second. It is thus becoming necessary to shift part of the computation tasks from cloud computing centers to edge devices in order to reduce energy consumption and time delay, saving communication bandwidth and enhancing data security and privacy."

[...] So far, the work by Chai and his colleagues has primarily focused on vision sensors. However, near-sensor and in-sensor computing approaches could also integrate other types of sensors, such as those that detect acoustic, pressure, strain, chemical or even biological signals.

Journal References:
1.) Feichi Zhou, Yang Chai. Near-sensor and in-sensor computing, Nature Electronics (DOI: 10.1038/s41928-020-00501-9)
2.) Feichi Zhou, Zheng Zhou, Jiewei Chen, et al. Optoelectronic resistive random access memory for neuromorphic vision sensors, Nature Nanotechnology (DOI: 10.1038/s41565-019-0501-3)
3.) Yang Chai. In-sensor computing for machine vision, Nature (DOI: 10.1038/d41586-020-00592-6)


Original Submission

posted by requerdanos on Sunday December 20 2020, @06:56PM   Printer-friendly
from the wintel-no-more? dept.

Microsoft may be developing its own in-house ARM CPU designs:

This afternoon, Bloomberg reported that Microsoft is in the process of developing its own ARM CPU designs, following in the footsteps of Apple's M1 mobile CPU and Amazon's Graviton datacenter CPU.

Bloomberg cites off-record conversations with Microsoft employees who didn't want to be named. These sources said that Microsoft is currently developing an ARM processor for data center use and exploring the possibility of another for its Surface line of mobile PCs.

[...] Even if Bloomberg's report proves 100 percent accurate, the end result is likely to follow Amazon's lead much more closely than Apple's. Although Amazon tightened its supply chain by producing its own Graviton hardware, its software ecosystem remains open—without solid Linux operating system support, a server's future in a data center is very poor indeed. Microsoft would face the same challenges with a data center-focused product, and for the same reasons—although the "less likely" Surface ecosystem would be considerably less constrained.


Original Submission

posted by requerdanos on Sunday December 20 2020, @02:11PM   Printer-friendly

Plants can be larks or night owls just like us:

Plants have the same variation in body clocks as that found in humans, according to new research that explores the genes governing circadian rhythms in plants.

[...] These rhythmic patterns can vary depending on geography, latitude, climate and seasons, with plant clocks having to adapt to cope best with the local conditions.

[...] To investigate the genetic basis of these local differences, [researchers at the Earlham Institute and John Innes Centre in Norwich] examined varying circadian rhythms in Swedish Arabidopsis plants to identify and validate genes linked to the changing tick of the clock.

Dr Hannah Rees, a postdoctoral researcher at the Earlham Institute and author of the paper, said: “A plant’s overall health is heavily influenced by how closely its circadian clock is synchronised to the length of each day and the passing of seasons. An accurate body clock can give it an edge over competitors, predators and pathogens.

“We were interested to see how plant circadian clocks would be affected in Sweden; a country that experiences extreme variations in daylight hours and climate. Understanding the genetics behind body clock variation and adaptation could help us breed more climate-resilient crops in other regions.”

[...] "It’s amazing that just one base-pair change within the sequence of a single gene can influence how quickly the clock ticks," explained Dr Rees.

(Emphasis from original retained.)

Journal Reference:
Hannah Rees, Ryan Joynson, James K.M. Brown, et al. Naturally occurring circadian rhythm variation associated with clock gene loci in Swedish Arabidopsis accessions, Plant, Cell & Environment (DOI: 10.1111/pce.13941)


Original Submission