Friday, 20 November 2015

The universe's resolution limit: why we may never have a perfect view of distant galaxies

James Geach, School of Physics, Astronomy and Mathematics
Can you make out the dot at the bottom of this question mark? What if you stand a few metres away? The finest detail the average human eye can distinguish is about the size of a full stop seen at a distance of a metre. This is called “resolution”. The best resolution an optical system – like the eye – can achieve is roughly given by the wavelength of the light being observed divided by the diameter of the aperture the light passes through.

In astronomy, resolution works just the same. This explains why we build increasingly large telescopes: not only can big telescopes collect more light and therefore see further, but the bigger the aperture of the telescope, the better, in principle, the image.
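To put rough numbers on this, the smallest angle a circular aperture can resolve is often approximated by the Rayleigh criterion, θ ≈ 1.22 λ/D. Here is a minimal sketch of the calculation (the aperture sizes are illustrative choices, not the specifications of any particular instrument):

```python
import math

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    """Rayleigh criterion: smallest resolvable angle, in arcseconds."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return theta_rad * 180 / math.pi * 3600  # radians -> arcseconds

wavelength = 550e-9  # green light, ~550 nm
for name, diameter_m in [("human eye (~3 mm pupil)", 0.003),
                         ("10 m telescope", 10.0),
                         ("39 m extremely large telescope", 39.0)]:
    print(f"{name}: {diffraction_limit_arcsec(wavelength, diameter_m):.4f} arcsec")
```

For green light this gives roughly 46 arcseconds for the eye, but only a few thousandths of an arcsecond for a 39 m telescope – an improvement of more than ten thousand times.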

But now a new study has suggested that the universe actually has a fundamental resolution limit, meaning no matter how big we build our telescopes we won’t see the most distant galaxies as clearly as we would like.

The trouble with telescopes

The largest visible-light telescopes on Earth, such as the Very Large Telescope and the Keck telescopes, have mirrors about ten metres in diameter, and there are now plans to build telescopes with diameters of 30m to 40m (so-called Extremely Large Telescopes). But there’s a problem: if light from an object (be it a candle, streetlight or star) is perturbed on its journey from source to detection, then we will never be able to produce an image as sharp as the theoretical maximum, no matter how big we make the aperture.

The huge primary mirror of the James Webb Space Telescope.

We know light can play tricks on us. Ever looked at the bottom of a swimming pool and seen the tiles appear to ripple and dance? Or put a straw into a glass of water and seen it seemingly “break” between the air and the liquid? Light travelling to our telescopes from space has to pass through a turbulent atmosphere, and this causes problems for astronomers.

Like a perfectly parallel set of ocean waves encountering a submerged reef, the atmosphere disturbs the waves’ propagation. For electromagnetic waves – light – this has the effect of blurring images. Unless we compensate for it, we never reach the theoretical maximum resolution of a telescope. Putting telescopes in space, above the atmosphere, is one solution, but it is costly. “Adaptive optics” is another, but it is technically challenging.

Quantum foam

The new study, presented at the International Astronomical Union General Assembly this year, makes a prediction about the nature of space using the strange world of quantum physics. It argues that the nature of space-time on the quantum level might give rise to a kind of “fundamental resolution limit” of the cosmos, meaning there might be cause for concern about how clearly future telescopes will be able to see the most distant galaxies.

The idea is as follows. According to quantum mechanics, on the smallest of scales, known as the Planck scale, some 10⁻³⁵ m (yes, that’s a decimal point with 34 zeros after it before you get to the one), space is described as “foamy”. On those small scales, quantum physics predicts that the universe is seething with so-called “virtual particles” which pop into existence and then quickly annihilate each other – something seen constantly in particle physics experiments. However, for the briefest of moments those particles have energy and therefore – according to the famous equation E=mc² – mass.
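For the curious, the Planck length quoted above follows from just three constants of nature. A quick sketch of the calculation (the constant values are the standard CODATA figures):

```python
import math

# Planck length: l_P = sqrt(hbar * G / c^3)
hbar = 1.054571817e-34  # reduced Planck constant, J s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

l_planck = math.sqrt(hbar * G / c**3)
print(f"Planck length: {l_planck:.3e} m")  # ~1.616e-35 m
```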

Any mass, no matter how small, is predicted to warp space-time. This is Einstein’s description of gravity. The most dramatic example of this phenomenon in nature is in the gravitational lensing of distant galaxies by massive clusters. Photons – particles of light – travelling through such foaming space-time would be affected by such fluctuations in a similar manner to light passing through our thick and turbulent atmosphere.

Of course, the effect is tiny – almost negligible. But a photon emitted from a distant galaxy has to make a very long journey across the universe, and along the way the countless “phase perturbations” caused by the foamy nature of space-time might add up. The prediction is that this effect is smaller than even the finest detail we can currently resolve with the best telescopes. But if the theory is correct, then this cosmic blurring might be apparent in images of distant galaxies made by next-generation telescopes, including Hubble’s successor, the James Webb Space Telescope, due for launch in 2018.
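To get a feel for how countless tiny perturbations can add up, consider a toy random-walk model: N independent phase kicks of typical size δφ accumulate to an RMS phase of roughly δφ√N. The sketch below illustrates only that statistical behaviour, in arbitrary units – it is not the calculation from the study itself:

```python
import math
import random

def rms_accumulated_phase(kick_size, n_kicks, n_trials=200):
    """Monte Carlo estimate of the RMS total phase after n_kicks random kicks."""
    total_sq = 0.0
    for _ in range(n_trials):
        phase = sum(random.gauss(0, kick_size) for _ in range(n_kicks))
        total_sq += phase ** 2
    return math.sqrt(total_sq / n_trials)

kick = 1e-6  # arbitrary tiny phase kick per step (illustrative units)
for n in (100, 1000, 10000):
    simulated = rms_accumulated_phase(kick, n)
    print(f"N={n:>6}: simulated RMS ~ {simulated:.2e}, "
          f"theory kick*sqrt(N) = {kick * math.sqrt(n):.2e}")
```

The RMS grows as the square root of the number of kicks, which is why a journey of billions of light years could make an individually negligible effect observable.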

However, there is so far no accepted theory uniting Einstein’s view of gravity with quantum mechanics – that is one of the key goals of modern physics – so we should take this prediction with a pinch of salt. Even if it is correct, its effects will only really be frustrating to the group of astrophysicists studying the detailed structure of the most distant galaxies.

What’s fascinating is the implication that no matter how big we make our telescopes, here on Earth or in space, there is a fundamental natural resolution limit to the universe beyond which we cannot probe, born of quantum processes but manifested on cosmological scales. Like a cosmic conspiracy, some of nature’s secrets may be forever concealed.

The Conversation
James Geach, Royal Society University Research Fellow, University of Hertfordshire
This article was originally published on The Conversation. Read the original article.

Friday, 6 November 2015

Six reasons why China's economy is weaker than you think

Geoffrey M. Hodgson, Hertfordshire Business School
The UK has rolled out the red carpet for Chinese president Xi Jinping on his five-day official visit. He is being given the royal treatment, including a stay at Buckingham Palace, a ride in a state carriage along The Mall and several banquets. The trip will also include plenty of time with the British prime minister, David Cameron, who is keen to discuss the trade and investment that the UK hopes to secure from the visit.
Britain’s pivot to China is largely based on China’s economic strength. And yet there is cause for concern. Having been the locomotive for global growth following the financial crisis in 2008, Chinese growth has now slowed and its economy is looking increasingly fragile. The latest GDP growth figures came in at just under 7%, significantly down from the astounding annual rate of more than 9% between 1990 and 2010.

Exports from China have declined, and exports to China must battle against the depreciating yuan. China’s slowdown has depressed global commodity prices, adversely affecting big exporting countries such as Brazil and Russia.

Some leading economists have been very optimistic about China. Nobel Laureate Robert Fogel published an article in 2010 predicting that China’s GDP would grow at an average annual rate of more than 8% until 2040, by which point its GDP per capita would be twice that projected for Europe and similar to that in the United States. Fogel used a textbook method of analysis to predict an unrelenting upward path.

But as countries grow, their service sectors tend to increase as a proportion of output and employment. Rates of growth of productivity in services tend to be much lower than in manufacturing or agriculture. Hence, in any economy, growth rates are likely to slow down through changes in economic structure. There are several other reasons why China’s economic growth is set to stall.

1. Demographic shifts

China will experience an adverse demographic shift in the coming decades. Three decades of the one-child policy have reduced the number of adults of working age. The recent and ongoing relaxation of that policy, plus a big decline in infant mortality, increases the number of children. Older people are living longer, due to improved healthcare and reduced poverty. Hence the average number of children and old people that each person in work must support is set to increase dramatically.

China’s population is ageing. Hung Chung Chih

2. Chinese GDP per capita is still low

China’s GDP per capita is way below that of the US and other developed countries. World Bank figures for 2014 put it at about 24% of that in the US. In the 20th century, only five countries managed to grow from 24% or less of US GDP per capita to 60% or more: Japan, Taiwan, South Korea, Singapore and Hong Kong. China still has a long way to go.

3. Lack of democracy

While there is some evidence that autocratic governments can help economic development at lower stages of development, particularly by promoting basic industry and infrastructure, there is strong evidence that democratic institutions are much more suited to higher levels of development. Notably, when Japan, Taiwan and South Korea reached about 45% of US GDP per capita, they were established or emerging democracies. A transition to a more democratic government may be necessary as China develops, but this would be very difficult to achieve – and could be highly disruptive.

4. Lack of openness

A democratic government is but one part of a constellation of vital institutions. As Nobel Laureate Douglass North and his colleagues have argued, dynamic modern economies need checks, balances and countervailing power to minimise arbitrary confiscation by the state. Legal systems have to develop significant autonomy from the political elite. In my book Conceptualizing Capitalism I show that absolute GDP per capita in a sample of 97 countries is strongly correlated with absence of corruption and openness of government. China is not an outlier in this test.

5. Problems with land and property rights

Unrest in the Chinese village of Shangpu was triggered by an unpopular land deal. REUTERS/James Pomfret

China’s population is divided into two classes. Chinese citizens are registered with either an urban or rural classification, depending on where they are born. Urban registrants have better education and health services.

Many rural registrants, meanwhile, have rights to the use of land. But these are often annulled after local party officials are bribed by business speculators and sell the land for profit. Frequent local protests result, and the whole system of land use is in dire need of radical reform. Currently it fosters corruption and inhibits the skill development of half of the Chinese population.

6. Lack of homegrown talent

Although there are many small firms in China, there are still few mainland-registered large firms. Barry Naughton has noted that of the top ten firms in China exporting high-tech products, nine were foreign. Offshore registration is understandable, because fear of state sequestration persists in a country that did not recognise private property rights in its constitution until 2007. China’s financial system is very heavily concentrated in state hands, with punitive penalties on private lending.

Thus, there are weighty institutional and demographic drags on further rapid growth in China, especially as it enters intermediate levels of economic development that are ill-suited to the continuance of a one-party state. China can succeed, but only through massive and potentially destabilising reform of its political and economic institutions. We should not be surprised by even lower growth rates in the future.

The Conversation
Geoffrey M. Hodgson, Research Professor, Hertfordshire Business School, University of Hertfordshire
This article was originally published on The Conversation. Read the original article.

Wednesday, 4 November 2015

Unheeded cybersecurity threat leaves nuclear power stations open to attack

Nasser Abouzakhar, School of Computer Science
There has been a rising number of security breaches at nuclear power plants over the past few years, according to a new Chatham House report which highlights how important systems at plants were not properly secured or isolated from the internet.

As critical infrastructure and facilities such as power plants become increasingly complex they are, directly or indirectly, linked to the internet. This opens up a channel through which malicious hackers can launch attacks – potentially with extremely serious consequences. For example, a poorly secured steel mill in Germany was seriously damaged after being hacked: the computer controls failed to shut down a blast furnace, causing substantial harm. The notorious Stuxnet worm, meanwhile, was specifically developed to target nuclear facilities.

The report also found that power plants rarely employ an “air gap” (where critical systems are entirely disconnected from networks) as the commercial and practical benefits of using the internet too often trump security.

In one case in 2003, an engineer at the Davis-Besse plant in Ohio used a virtual private network connection to access the plant from his home. While the connection was encrypted, his home computer was infected with the Slammer worm which infected the nuclear plant’s computers, causing a key safety control system to fail. In another incident in 2006 at the Browns Ferry plant in Alabama the controllers for one of the reactors’ cooling pumps failed in reaction to a flood of network traffic on the plant’s internal computer network, which required the reactor to be shut down manually to avoid a meltdown. Although this wasn’t due to any sort of cyber-attack, it shows the susceptibility of industrial control systems to the sorts of events that could be triggered by malicious actors or by malware like Slammer.

The report also found that there is a general lack of knowledge of cybersecurity on the part of management who have generally shown a poor understanding of good “IT hygiene” and how it relates to security. It was quite common, the report said, for factory default passwords to be left unaltered and off-the-shelf software to be used despite known issues that were left unaddressed.

The problem is that the industrial communication protocols and mechanisms still commonly used in nuclear power plants were designed in an era before the internet and cyber-threats were a consideration. These are often insecure and not designed to deal with such challenges. Most of the legacy communication protocols such as Profibus, DNP3 and OPC are still vulnerable to various attacks as they lack any proper authentication techniques.

This means that all a malicious hacker might need to get inside a nuclear power station’s network is Google. Using search terms relevant to the software in use in the plant, Google can turn up direct links to websites leading into its network – with little or no security in the way.

An example of this is the technique used to hack internet-connected webcams. Searching for text used in the webcam login page, Google will turn up links to cameras all over the world. Many users fail to change the default username and password (which are easily found online), meaning that the cameras can be accessed and controlled with ease.

The same sort of techniques can be used to locate, not webcams, but web-connected industrial devices potentially providing access to important facilities. Search engines such as Shodan can identify these sorts of devices and even use geo-location to pin down their physical location.
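On the defensive side, security teams can use the same services to audit what their own organisation exposes to the internet. Below is a minimal sketch using the official shodan Python library; the API key, organisation name and query are placeholders (port 502 is commonly associated with the Modbus industrial protocol):

```python
import shodan

# Placeholder: use your own API key from your Shodan account settings.
api = shodan.Shodan("YOUR_API_KEY")

# Hypothetical audit query: assets registered to your own organisation
# that expose port 502 (commonly the Modbus industrial protocol).
results = api.search('org:"Example Utility Ltd" port:502')

print(f"Exposed devices found: {results['total']}")
for match in results["matches"][:10]:
    location = match.get("location", {})
    print(match["ip_str"], match["port"],
          location.get("city"), location.get("country_name"))
```

Running this sort of audit regularly makes it far more likely that an organisation finds its own exposed devices before anyone else does.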

There’s a lot of civil infrastructure, and a lot of it is vulnerable. Bill Ebbesen, CC BY

Mind the gaps

Unauthorised access by hackers to important systems in a power plant is a serious matter: anything that damages or disturbs the balance of operations within the plant could lead to a shutdown, or even to dangerous situations when shutdown routines fail, while power surges within the plant could affect transmission infrastructure outside. Whether we are talking about a nuclear power plant or not, the end result is likely to be production failures or financial losses, or even injury and death. Of course, with a nuclear power plant the risks are that much greater because of the radioactive fuel in use.

Managing cybersecurity risks is challenging – and the Chatham House report makes several recommendations: integrated risk assessments to ensure security measures are properly implemented, and penetration testing where experts attempt to pry into and circumvent security measures, to ensure that the plant’s staff find any security holes before hackers do.

Organisations need to be far more aware of the potential effects of attacks: what could happen if various control systems were used incorrectly. This way it will become more apparent where resources should be dedicated to protecting them. Only rigorous research and testing will develop the security approaches and technologies needed to respond to this quickly evolving cybersecurity threat, keeping the power stations running and the lights on.

The Conversation
Nasser Abouzakhar, Senior Lecturer, University of Hertfordshire

This article was originally published on The Conversation. Read the original article.

Tuesday, 16 December 2014

Rapeseed oil versus olive oil

Guest blog Dr Richard Hoffman, School of Life and Medical Sciences

Rapeseed oil is increasingly touted as being as healthy as olive oil. But is there good evidence for this? In an article published in the British Journal of Nutrition, Dr Richard Hoffman, School of Life and Medical Sciences, reviews the evidence for the health benefits of rapeseed oil. Although rapeseed oil is rich in "good" fats - mainly monounsaturated fats and polyunsaturated fats - there is very limited evidence that this translates into a real reduction in disease risk. By contrast, the evidence for the health benefits of virgin olive oil is very strong, and this may be linked to the high levels of antioxidants in the oil; unfortunately these are lacking in rapeseed oil. So, based on current evidence, the conclusion from this study is that though it costs a little more, olive oil, especially virgin olive oil, is healthier and gives you more protection against disease.


Friday, 26 September 2014

Love's Passion: Philosophical Perspectives on Love

With the revival of interest in love, the Philosophy department at the University of Hertfordshire successfully hosted a two-day international workshop with twenty participants from ten countries. This was the first time that three different international research networks* were brought together.

Entitled Love's Passion: Philosophical Perspectives on Love, the workshop aimed to move the focus of discussion within the philosophy of love to issues such as love’s intentionality, the link between love and desire and the connection between love, virtue and the good. Another objective was to lay down the groundwork for a larger companion event on Love and the Good, due to be held in the Czech Republic in the summer of 2015.

Tony Milligan
The workshop was organised by Tony Milligan from the University of Hertfordshire and Kamila Pacovská from the University of Pardubice, and its participants were drawn from various philosophical traditions, from analytic philosophy and Wittgensteinian approaches through to phenomenology and continental philosophy.

Milligan, a lecturer at the University, revealed that a number of the papers discussed at the event have already been earmarked for publication in English-language publications (and, in one case, a French journal on political philosophy).

The prospect is that an edited volume and/or special edition of a journal will be produced once the larger picture of ongoing research emerges at next year’s conference in the Czech Republic. This will complement the edited, new-directions volume Love and its Object by Christian Maurer, Tony Milligan and Kamila Pacovská, which is due out with Palgrave Macmillan later this year.

Read below for the full event summary:

Roberto Merrill
Day one opened with a Wittgensteinian-influenced paper by Niklas Forsberg, Uppsala University, on ‘Thinking About a Word – Love for Example’, and was followed by a series of further papers.

Julia’s paper brought the work of Iris Murdoch into focus. Discussions highlighted the extent to which traditions outside of the recent analytic debates could supplement, and be brought into discussion with, the precision aimed at in the latter. Roberto’s paper also helped to highlight the potential for a discourse on love and political philosophy.

Day two included a postgraduate session with excellent short papers from Monica Roland, University of Oslo, tackling ‘Velleman on the Maximum Reasons for Love’ and from Robbie Kubala, Columbia University, dealing with ‘Proust on the Reasons for Love’. Roland drew out the point that a skewed understanding of one of the seminal papers on love may well have shaped the discourse. Kubala delivered an analytic presentation on the sorts of questions about love which emerge in one of the key, exemplary, literary texts which are familiarly drawn upon by philosophers of love.

Kamila Pacovská delivered a fascinating full-length version of her paper on ‘Loving the Miserable’, with a focus on Simone Weil. Maria Silvia Vaccarezza, University of Genova, picked up on the connection between Murdoch and Weil in ‘Emotion or Virtue’, a paper which drew upon her important work as an Aquinas translator. Kate Larson, Södertörn University College, Stockholm, presented an amusing and very insightful paper on ‘Falling in and out of Love’ which drew connections between Murdoch and Plato’s concept of eros. The closing paper by Tony Milligan continued the exploration of the themes opened up by Larson with a paper on ‘Abandonment and the Constancy of Love’, reworking an argument presented earlier in the summer at the Religion and Emotional Experience event at the University of Konstanz.

Philosophical thinking
*The three international research networks were Analytic Philosophy of Love, Continental Philosophy of Love and scholars with a particular interest in the work of Wittgenstein, Simone Weil and Iris Murdoch.

Tuesday, 5 August 2014

Master challenge: the UK food system

The UK food system is increasingly globalised, which makes it prone to periodic scares and crises. UK consumers will be only too aware of the challenges of the food system, which they experience through rising food prices and scares about the provenance of meat products. The Economic and Social Research Council (ESRC) and the Food Standards Agency (FSA), under the Global Food Security programme, are funding five grants under the ‘Understanding the Challenges of the Food System’ call.

Dr Faith Ikioda, Dr Wendy Wills and other colleagues at the University of Hertfordshire have received funding as part of this programme to investigate the views and experiences of people aged 60+ in terms of how they acquire food.

People in the UK are living longer, and predictions say this trend is set to continue. A significant minority of older people have ongoing health conditions, and among those aged over 85 up to two thirds have a disability or limiting long-term illness. These older people might become vulnerable through the food that they eat, making this a research priority in terms of its impact on the UK food system, quality of life for individuals, better public health outcomes, and reducing the burden of disease and disability, not to mention the resultant economic benefits for the UK.

The research will be undertaken over a two-year period and will involve households of older people living in Hertfordshire and the surrounding area. For further information, contact Dr Faith Ikioda or Dr Wendy Wills.

Friday, 25 July 2014

The RoboCup 2014 Final!

Guest blog Dr Daniel Polani, Adaptive Systems Research Group

We had played seven matches against the likes of Hamburg, Brazil and Indonesia and it was now time to face Japan’s CIT Brains in the final.

It was a tough match: the Japanese team took the lead, scoring three goals, and we thought it was going to be a walkover of the Germany vs Brazil kind. However, our robots fought back and we managed to score twice before the whistle blew. Our robots seemed the stronger side over the long run, and had the game gone on longer we might have equalised. But it was an exciting game, with a final score of 3:2 to the new world champions.

We are delighted to have reached the final, which follows on from our 2nd place at the RoboCup German Open (April 2014) and 3rd place at the RoboCup Iran Open (April 2014). I’d like to thank everyone for their support, and I am proud to say that Bold Hearts are Vice World Champions in RoboCup KidSize Football!