Future Trends in Energy

After Einstein’s work showing that matter and energy are interchangeable, we can say that the universe is simply a great amount of energy in different states. Rational beings are a kind of aggregated energy in a certain form, a form that tries to preserve its state over time. Human activities are driven by the natural need to obtain energy in order to preserve human life, to develop it and to multiply it. Civilizations are the result of replacing the natural impulse of searching for food (the energy source for our bodies) with a planned way of obtaining it. It is no exaggeration to say that energy is the only important thing in our human lives, because energy is, basically, the only thing that exists in the universe.

Modern civilizations are very dependent on energy sources; a modern society cannot be preserved without large ones. What makes a society more advanced is its capability to understand, obtain and use larger energy sources. What defined our societies in the twentieth century was mankind obtaining energy from the atom: we take advantage of the fission of heavy atoms in order to release a huge amount of energy. Nuclear power stations have been a major driver of the social development of Western societies. Access to a huge energy source provides independence and security, and avoids the need to search for large energy resources in other countries. Although European societies have been, and still are, very dependent on oil, their capability to move to other sources much more efficient than fossil fuels puts them in a good negotiating position with other countries.

The twenty-first century will be defined by a different energy source. Although from the energy viewpoint nuclear fission is very efficient (it produces a huge amount of energy from a small amount of fuel), it has many problems that are not easy to overcome, related to safety assurance, radioactive waste, and so on. Nowadays, the energy industry is focused on obtaining energy from renewable sources; however, this is not a great technological advance. Most renewable sources amount to extracting energy from the kinetic energy of the molecules of a fluid through a mechanical device. Conceptually, there is no significant difference between extracting energy from the atmosphere with a wind turbine in order to feed an electrical mill that turns the electricity back into mechanical work, and the same process done by one of Don Quixote’s “giants”, which turns the kinetic energy of the molecules of the atmosphere directly into the mechanical work of the mill. The former is more complex than the latter, but the mechanical work proceeds from the same energy.

Advances in energy technology have been driven by the increasing demand for energy in modern societies. Human activities require more and more energy, and the search for new, more powerful energy sources has been constant in recent decades. Nuclear power is our latest achievement; however, the game is not over.

In the previous century, science produced two spectacular advances. The first was Einstein’s relativity, which shows a similar nature between space and time, and between matter and energy; the second was quantum mechanics, which shows the quantized and oscillatory nature of matter. Both advances have driven us towards a better comprehension of the universe and of its energetic processes, which we can try to replicate in order to take advantage of huge amounts of energy.

Almost all of the energy that we can use for human purposes comes from the Sun. Without the Sun’s energy it would be impossible to have energy sources on the surface of the Earth such as wind or water. Strictly speaking, there is no truly renewable energy, because even the Sun will burn out completely in time. Of course, the most probable scenario is that there will not be any human beings on Earth by then.

If the Sun is the primary energy source, and our terrestrial sources proceed from it, we can think that the most efficient way to obtain huge amounts of energy should be the replication of the energetic processes of the Sun.

Astronomers have found that the energy emitted by the Sun and other stars is produced by nuclear reactions. However, these are fusion reactions instead of fission reactions. In a fission reaction the nucleus of a large atom splits into several smaller ones, whereas in a fusion reaction two nuclei of small atoms join to produce a larger one. The energy liberated is due to the creation of a bond supported by the strong nuclear force. This force acts at very short distances, and in order to produce the new bond it is necessary to overcome the electrical repulsion between the charges of the two nuclei, which acts at larger distances.

Fusion reactions require a very high temperature in order to take place. On Earth it is very difficult to create the conditions of pressure and temperature required to produce this kind of nuclear reaction in a continuous way; the lower the required temperature, the greater our capability to reach it. Although astronomers have found that fusion reactions producing elements up to iron (Fe) while liberating energy are common in the universe, the reaction that requires the lowest temperature is the one that produces nuclei of helium (He) from two nuclei of hydrogen (H).

Hydrogen exists in three different forms or isotopes: protium (H), deuterium (D) and tritium (T), with zero, one and two neutrons joined to a single proton, respectively.

The reaction that takes place at the lowest temperature is:

D + T → ⁴He + n + 17.6 MeV

The other interesting reaction is between two nuclei of deuterium, which has two branches:

D + D → T + H + 4.0 MeV

D + D → ³He + n + 3.3 MeV

These two branches are roughly equally probable.

Tritium is radioactive, with a half-life of around 12 years, whereas deuterium is stable. Deuterium can be found in nature, because a significant fraction of natural hydrogen takes that form; it can be extracted from seawater in such huge amounts that it is practically inexhaustible. Tritium, however, must be produced through another nuclear reaction; in general it can be bred from lithium (Li). Lithium exists on Earth in large quantities: it is estimated that the current known reserves of lithium could satisfy today’s energy needs for thousands of years through this kind of reaction.
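The energy figures of these reactions can be checked from published atomic masses: the energy released is the mass defect of the reaction multiplied by c². A minimal sketch in Python (isotope masses taken from standard tables; this is an illustration, not reactor physics):

```python
# Q-values of the fusion reactions above, computed from the mass defect
# (Q = Δm · c², with c² expressed as 931.494 MeV per atomic mass unit).
U_TO_MEV = 931.494

# Atomic masses in unified atomic mass units (u), from standard tables.
masses = {
    "H":   1.007825,   # protium
    "D":   2.014102,   # deuterium
    "T":   3.016049,   # tritium
    "He3": 3.016029,
    "He4": 4.002602,
    "n":   1.008665,   # free neutron
    "Li6": 6.015123,
}

def q_value(reactants, products):
    """Energy released (MeV): mass of reactants minus mass of products."""
    dm = sum(masses[x] for x in reactants) - sum(masses[x] for x in products)
    return dm * U_TO_MEV

print(f"D + T   -> He4 + n : {q_value(['D', 'T'], ['He4', 'n']):.2f} MeV")
print(f"D + D   -> T   + H : {q_value(['D', 'D'], ['T', 'H']):.2f} MeV")
print(f"D + D   -> He3 + n : {q_value(['D', 'D'], ['He3', 'n']):.2f} MeV")
print(f"Li6 + n -> He4 + T : {q_value(['Li6', 'n'], ['He4', 'T']):.2f} MeV")
```

The last line is the lithium breeding reaction that produces the tritium: a neutron from the D–T reaction splits ⁶Li into helium and fresh tritium, also releasing energy.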

Although a fusion reactor produces radioactive materials like tritium, their half-life is much shorter than that of, for instance, the residual plutonium of some kinds of fission reactors, making the waste less dangerous and much more manageable.

Fusion energy could be the answer to the challenges mankind will face in the near future. One of its applications could be space travel: a rocket powered by nuclear fusion could produce an impulse hundreds of times that of a chemical rocket, which would make any future interplanetary mission easier.

Fusion reactors are now at the research and development stage. The international community is building a fusion reactor known as ITER, which is only an experiment, but one that should provide the knowledge needed to build future demonstration reactors able to supply energy on an industrial scale.

The construction of a fusion reactor is a great scientific and engineering challenge, because it requires new, advanced confinement technologies able to withstand very high pressures and temperatures. Gravitational confinement is only possible in stars. ITER follows the tokamak concept, which is a design based on magnetic confinement.

The advantages of fusion energy are enough to take it seriously into account as an important driver of scientific and technological development in the coming years. The applications of a huge energy source can be unimaginable. Many people erroneously confuse technology with application: in the press, Facebook or Twitter are often considered modern technologies, although they are simply applications; the technologies are the computers, the programming languages, the physical communication networks and, of course, the power plants generating the energy that feeds them. We should never forget that, in fact, the entire universe, including us, is actually energy.

Other energy sources, such as those known as renewables, still have a long path ahead; the path of fusion energy may be even longer, because we are still at the beginning and its future applications have not even been imagined.


Mr. Luis Díaz Saco

Executive President


advanced consultancy services


Nowadays, he is Executive President of Saconsulting Advanced Consultancy Services. He has been Head of Corporate Technology Innovation at Soluziona Quality and Environment and an R & D & I Project Auditor at AENOR. He has acted as Innovation Advisor to Endesa Red, representing that company in workgroups of the Smartgrids Technology Platform in Brussels and collaborating in the preparation of the research strategy of the EU.


Information is not power, but knowledge is

In the era of big data, most people think that information is the ultimate source of power; however, this is an erroneous concept, and wrong concepts lead to bad actions and, finally, to a loss of real power.

Information can be seen as a set of data samples about a phenomenon. Knowledge is much more than this: knowledge is provided by a model of the behavior of that phenomenon that lets us make predictions about the future. Power resides in the anticipation of the future, which lets us take advantage of it.

This is not a new concept. Bertrand Russell wrote about how Chinese emperors protected the Jesuits because they could predict eclipses better than the imperial astronomers, a fact that could be used to show the power of the Empire and its ruler to the people.

Since ancient times, even political power has been linked to knowledge, although for common people it was related to magic instead of science.

Espionage services are known as intelligence services for the same reason. Although the hard work of espionage is gathering hidden data, what provides value to a government is not the raw data but data processed in order to make the right decisions. Intelligence can be seen as the process of distilling a large set of data in order to anticipate an action that gives us an advantage in reaching some desired target.

In the era of big data this remains valid. The role of data scientists is essential in order to take advantage of the information gathered by a system working with big data. However, we are at the beginning of this era and the systems processing large amounts of data are still very naïve, although this technology is improving its performance fast, day by day.

There are several things to be considered when we are working with a large amount of data:

  • We need to know the limits of information processing.
  • We need to know the issues related to the quality of information.
  • We need to know the effects of noise in the data.

Gathering data does not, by itself, provide a model of reality. This was demonstrated through the Nyquist sampling theorem, which shows that in order to recover a bandlimited continuous signal it is necessary to sample it at more than twice its highest frequency. In simpler words, in order to obtain perfect knowledge of a periodic phenomenon it is necessary to gather a certain amount of data. The theorem goes further and shows that sampling the signal above that rate cannot provide more information about it; in other words, gathering more than the required data about a phenomenon will not provide more knowledge about it.
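The first half of the theorem can be illustrated with a small sketch (the frequencies are chosen purely for illustration): a 5 Hz sine sampled at only 8 Hz, below the required 10 Hz, produces exactly the same samples as a 3 Hz alias, so no algorithm, however clever, can tell the two signals apart from the data alone.

```python
import numpy as np

# Under-sampling sketch: a 5 Hz sine sampled at 8 Hz (below the Nyquist
# rate of 10 Hz) yields exactly the same samples as a 3 Hz alias.
fs = 8.0                      # sampling frequency (Hz) -- too low
t = np.arange(16) / fs        # sample instants

f_true, f_alias = 5.0, 3.0    # alias frequency = fs - f_true
s_true = np.sin(2 * np.pi * f_true * t)
s_alias = np.sin(-2 * np.pi * f_alias * t)   # note the sign flip

print(np.allclose(s_true, s_alias))   # True: the samples are identical
```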

This is important in big-data systems: the knowledge about a phenomenon will not be improved by increasing the gathered data once we already have a perfect model of it. Many companies may be selling nothing at all in the information-gathering business.

Another important aspect is that models are a source of uncertainty, because we need to make assumptions about reality in order to create them.

First of all, most statistical analyses try to simplify a phenomenon by supposing that the problem is linear; however, this kind of simplification often cannot be assumed, because reality is usually non-linear.

When data scientists try to model a system, they usually use polynomials. Even if the system really is polynomial, in order to recover it we need to know the degree of the polynomial. We will not get better results by fitting a polynomial of a higher degree to the same set of data, and the errors behind the resulting actions can be huge.
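A minimal sketch of this effect, with invented data: samples generated by a quadratic plus a little noise, fitted once with the correct degree and once with a much higher one. Both fits pass near the data, but the high-degree fit fails badly as soon as we step outside the sampled interval.

```python
import numpy as np

# Overfitting sketch: data generated by a degree-2 polynomial plus noise.
# A degree-2 fit generalizes; a degree-9 fit chases the noise and gives a
# huge error as soon as we leave the sampled interval.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 10)
y = x**2 + rng.normal(0.0, 0.05, x.size)   # true system: y = x^2

fit2 = np.polyfit(x, y, 2)
fit9 = np.polyfit(x, y, 9)

x_new = 1.5                                # a point outside the data
err2 = abs(np.polyval(fit2, x_new) - x_new**2)
err9 = abs(np.polyval(fit9, x_new) - x_new**2)
print(err2 < err9)   # True: the higher-degree fit is far worse
```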

In other words, knowledge about the physics of the system will help us to calculate a better model than using computational brute force alone. Knowledge about the physics of the system is always better than the “stupid method” of the computer programmer.

Another issue to be considered is the following: thinking only of Nyquist’s theorem, we can recover a periodic signal even if the data is biased, provided that we sample the signal at the correct frequency. This is true, but the reason is that we are applying scientific knowledge of physics, namely the assumption that the signal is periodic, instead of the “stupid method” of the computer programmer. It no longer holds if we do not have that knowledge about the signal. If the signal were polynomial, the bias could be important.

Imagine a system that we know can be modelled as a polynomial. That kind of signal is not bandlimited, and Nyquist’s theorem cannot be applied. If we only have data from the left side of the domain, all the values will have the same sign (for instance, positive), while the sign on the right side will depend on the degree of the polynomial: if it is odd, the sign will be the opposite; if it is even, the sign will be the same.


Figure 1 The same set of three numbers can be interpolated through the functions x² and −x³/3 − 2x/6

This fact is an example of the importance of data quality: not all data sets have the same quality for providing a model of a system. In order to make a good interpolation, the data points must be distributed uniformly over the entire domain of the system.

In the era of big data this is especially important, and often it is not well analyzed. Although I could give some example related to social affairs, I will try to be politically correct and take an example about myself. Imagine that a recruitment company is trying to classify me through a big-data system in order to determine which job best fits my profile. I have worked in a pure-science research center as a researcher, and I have worked in a quality engineering company as a manager. The first fact would suggest that I am interested in creative positions, trying to provide new things for mankind outside the standard way of working; the second would suggest that I am an expert in the standardization of tasks and would be interested in defining and following rules.

That profile is too complex to fit an easy classification, and it would be even more complex if we continued forward in time. Information biased to the left or the right of the temporal axis would lead to an incorrect classification. In this case, it is important to analyze the whole profile and to have a more sophisticated classification method for more complex profiles. Indeed, one of the reasons for big-data analysis is the analysis of more complex situations: the more complex the system, the larger the amount of required data, and the more complex the analysis procedure. As in the example of espionage and intelligence, large-scale analysis requires large-scale intelligence. That is the reason why the development of artificial intelligence has become fashionable in the era of big data.

Immunity to noise is very important when we are trying to define the model of a system. What happens when there are pieces of incorrect data inside the gathered data? Imagine a linear system and an analysis algorithm that tries to determine the order of the system that best fits the data. A small shift in a few data points can make the algorithm increase the order of the system so that it fits the data perfectly. In any system for analyzing data, we need some method to filter the data, trying to avoid the negative effect of noise.
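A toy illustration of this point, with invented numbers: a truly linear system with one corrupted sample. An ordinary least-squares fit is dragged by the outlier, while a crude median-based filter that discards anomalous points recovers the true slope.

```python
import numpy as np

# Outlier sketch: a truly linear system y = 2x + 1, with one corrupted
# sample.  An ordinary least-squares fit is dragged by the outlier; a
# simple median-based filter that drops anomalous points recovers the
# underlying line.
x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0
y[4] += 30.0                       # one corrupted sample

slope_raw = np.polyfit(x, y, 1)[0]     # biased by the outlier

# Filter: estimate a robust slope from the median of the increments,
# then drop the samples whose residual against it is large and refit.
med_slope = np.median(np.diff(y))
resid = y - med_slope * x
keep = np.abs(resid - np.median(resid)) < 3.0
slope_clean = np.polyfit(x[keep], y[keep], 1)[0]

print(round(slope_raw, 2), round(slope_clean, 2))
```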

Again, this is not a definitive solution. Filters can eliminate special cases: they search for a mean behavior, discarding the data far from it. Applied to a complex system, this kind of solution may not be good enough.

From this point, we can understand that complex systems are not easily modelled. The use of generic modelling techniques on complex systems under great uncertainty can lead to very bad results.

Knowledge is power, but we must be sure that we have the proper knowledge. Scientific knowledge advances over time, and some people produce great leaps in the knowledge owned by mankind that improve our technology and our lives. Any model has a domain of application; outside it, it cannot be considered knowledge. Newton’s physics of space and time can help us determine how long a car trip will take, but without Einstein’s physics we could not have satellite navigation. Namely, limited knowledge provides limited power. The reason for basic research is a matter of power too: better knowledge provides greater power to mankind.

Scientific knowledge is linked to its limits; the use of computer algorithms without taking those limits into account leads to very wrong decisions.

Back to the previous example, now including big data and social networks: I have a lot of contacts in the quality business, and most classification algorithms would infer an analogy between my profile and the profiles of those contacts. However, although it is true that I have great knowledge of standardization and of management methodologies like theirs, my role inside a quality organization was totally different, because I was an innovation manager. The role of a quality expert is to provide uniformity in procedures, so that things can be done in a repetitive and secure way, increasing the capability of the managing staff to assure the goodness of the provided products and services; the role of an innovation manager is to drive changes in the organization so that it can provide new products and services. Classifying people through a single common word does not capture the different roles of different persons in an organization, and this cannot be done simply by counting the number of links among different kinds of people.

On the other hand, the risk of treating different things as equal in order to simplify the analysis process in a computer program is not an economic one. Day by day, many businesses are driven towards the majority, because through the internet it is possible to provide expensive services at a low price, thanks to a huge market volume that compensates for the required investment. A few clients left unsatisfied by an incorrect segmentation do not produce a great economic loss, and providing different products for them might not be profitable.

The risk is related to knowledge and power. In a global market, companies will get more income from mediocrity than from investing in improving the quality of their products. High-quality products will never be interesting if educated people are wrongly classified. There has always been a different demand from common people and from educated ones. Think, for instance, of music. Cult music, commonly called classical, has existed for many centuries, sharing the stage with popular music. In previous centuries, writing cult music provided much more income than writing popular music, because a rich patron could pay much more than a small group of poor people. Nowadays, the situation is the opposite: no patron can pay as much money as a musician can earn selling his song on iTunes, at the economical price of one dollar, all over the world through the internet. The result is obvious. Everybody knows now who Mozart was, and he was famous while he was alive; but nobody knows the current musicians writing cult music, while everybody knows the names of many musicians writing popular music, who are in touch with the higher social classes thanks to their huge earnings.

The question arising here is: if technical and social improvements are driven by advanced knowledge provided by selected people, can a future society improve in a world built for mediocrity? There is something that can help to avoid this: competition. Man reached the Moon because of the space race between the USA and the USSR, not because people demanded it. Competition among companies can produce the search for better products and services for people who do not demand them, even when market trends point towards mediocrity. Competition among companies can produce better tools for classification and market analysis, and it can drive the customization of solutions for different kinds of people, because better segmentation reduces the cost of that customization process.

Competition is what will demand new and better products and services, and what will make mankind advance after the era of big data. A society or organization where competition, making new things, or enjoying thoughts and solutions different from the majority’s are penalized will be a society with less power, doomed to be stuck in mediocrity.

To finish this discussion, I would propose that you take complexity-analysis techniques into account in order to cope with huge amounts of data where traditional statistics and classic modelling techniques cannot be effective enough. These techniques can provide knowledge about the structure of the system that will be useful for increasing our capability to make better decisions.




Two different models of national innovation management

When we analyze the recent crises, we usually look at the financial world as if all the answers were there; however, this is the main error of the analysts. Crises can be financial or can proceed from the real economy, but financial solutions only fit the short term, and long-term solutions must be provided by the real economy. The real world and the financial world are totally linked: that is why we cannot govern with a single vision, either financial or industrial.

Long-term solutions for any economy will require an improvement in competitiveness, which can only be sustainable if it is based on innovation.

Innovation is the result of different processes; however, as I am writing here about the long term, I am going to use R & D activity as the field of analysis.

If we analyze the main economies of the Euro zone, we find that the R & D effort differs across countries. This implies a structural difference between countries that will drive the future evolution of the Euro zone economies in different ways.

Gross Domestic Expenditure in R & D as percentage of the GDP. Data Source OECD


Germany and France spend double what the Southern countries do on R & D. The differences among these economies are structural, and they are driven by their governments (regardless of political color), as we will see in the following discussion.

These differences come from the past. An important increase in R & D expenditure is not a simple matter, because expert human resources are needed; it is not simply a question of increasing the funds for the activity.

Regarding human resources, we can see that there are important differences:

Total people dedicated to R & D. Data Source: OECD


As we can see, Spain and Italy have a similar number of people dedicated to R & D, although Italy’s GDP is higher.

This can be seen better in the next graph, which makes a comparison with the total number of jobs.

Ratio R & D Staff per Total Jobs. Data Source: OECD


As we can see, France and Spain have more people dedicated to R & D than would be expected from their total expenditure, in comparison with Germany and Italy. The reasons for this can be linked to the political vision of innovation and the role of the state in R & D activity. To take this issue into account, I am going to make a comparison between the largest and the smallest economy in the analysis.

In the case of Germany, the share of R & D performed by business enterprises is much higher, as we can see in the following graphs:

Business Enterprise Expenditure and Gross Domestic Expenditure in R & D of Germany. Data Source: OECD


As we can see, most R & D is performed by the private sector. This is not the case in Spain.

Business Enterprise Expenditure and Gross Domestic Expenditure in R & D of Spain. Data Source: OECD



In Spain, only half of GERD is performed by private companies. Spain has a high ratio of research jobs to total jobs because most of them are in the public administration. The effect of this is that innovation is farther from the markets in Spain than in Germany; the capability to influence the future economy of the country will be lower, because the connection between R & D and the markets is much weaker.

At a quick glance, we can see how the crisis of 2007 was handled in very different ways. While Germany increased both GERD and BERD linearly, Spain stopped the growth of R & D expenditure in both the private and the public sector. This is not surprising, as Spain had to face a huge crisis in its banking sector, which implied scarcer and more expensive financial resources, especially for the activities with higher risk. This is a clear case of how a financial problem affects not only the present real economy but the future one.

Looking at the previous graphs, we might think that policies of state expenditure on R & D have no influence on the productive model, or that they are inefficient; however, this is not true. The difference is not the public support itself but how the public support is implemented. You can see this in the following graphs.

Financed and performed percentage of R & D by Business enterprises in Germany. Data Source: OECD.


Looking at this graph, we can see how at the beginning of the century less than 50 % of the R & D activity performed in Germany was financed by industry. This implies that the rest of the activity was financed with public resources: the German state was strongly funding R & D in the private sector, strengthening private companies with public resources.

Financed and performed percentage of R & D by Business enterprises in Spain. Data Source: OECD.


In Spain, we can see that the support of the state for business enterprise R & D is barely noticeable, especially compared with the German case. Spanish industry funds almost all of its own R & D activity, while public resources for R & D (human and financial) are consumed by public institutions very far from the markets.

The results of these two different models of innovation management will be seen in the future: we will see whether strong state support is better, or whether the absence of support produces stronger companies. As a conclusion, we can say that Spain has strongly improved its productivity in recent years, but this was done through the salaries of the staff. Public resources have gone to restructuring its financial system and to preserving public jobs, instead of improving the future performance of the real economy. This is not surprising given the lack of resources; what is surprising is that it was not done in the first years of this century, when the Spanish economy was growing faster than most European ones and there were plenty of resources for this task, with a surplus in the public accounts. The reasons are structural rather than temporary, and they do not seem to be influenced by the political color of any particular government.



Future trends: Virtual Reality

Simulation is a very important technique in industry. From the beginning of digital computers, one of their main uses has been the execution of simulation algorithms to make predictions about the evolution of many kinds of processes. With the evolution of computer devices for interacting with the user and with other industrial machines, simulation acquired a different dimension, in which the simulator can interact with the real world, substituting for some part of it.
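As a toy illustration of simulation as prediction (all parameters here are invented): stepping the fall of an object with air drag through time lets us predict when it will reach the ground, without needing a closed-form solution.

```python
# Toy prediction by simulation: explicit Euler integration of an object
# falling with quadratic air drag.  The parameters are illustrative, not
# taken from any real system.
def simulate_fall(height, mass=1.0, drag=0.1, g=9.81, dt=0.001):
    """Return the predicted time (s) for an object to fall `height` meters."""
    y, v, t = height, 0.0, 0.0
    while y > 0.0:
        a = g - (drag / mass) * v * abs(v)   # gravity minus quadratic drag
        v += a * dt                          # update velocity
        y -= v * dt                          # update height
        t += dt
    return t

print(round(simulate_fall(100.0), 2))   # predicted fall time from 100 m
```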

Simulators provide a lot of value in the field of engineering, because they let engineers test different solutions at very low cost. Simulators are also widely used for training expert technicians, pilots, and so on.

Simulation is a very common tool in engineering: engineers have many tools to simulate different solutions before selecting one of them in order to develop or build a new device or infrastructure.

A useful training simulator can be a very powerful tool; however, the higher the precision of the simulator, the higher its development cost. This makes the market for highly precise simulators very small. They are built only for applications that involve very expensive and complex systems, where any reduction in the number of handling failures produces big savings.


Picture 1 A virtual aerial tour over Madrid Barajas Airport towards Madrid City with the FlightGear free flight simulator

This kind of simulator is a tool for risk reduction. For instance, a trained operator of a nuclear facility will reduce the probability of greater damage to people in an accident, because his response to the crisis will be faster and safer.

These simulators usually have a real interface connected to a computer that simulates the system under control. The use of virtual reality instead would lower the development costs of the simulator and bring some advantages, because we can substitute virtual panels for real ones, and even extend the simulation to the environment surrounding the control panel, similar to the real facility.

Virtual environments are useful not only for training people but also for testing new devices. Imagine that you have modelled an automated industrial facility in a virtual environment. You can test a new device in the traditional way, adding a virtual model of the device to the virtual environment; but you can also connect the real device to the virtual environment, in the same way that an operator interacts with a training simulator. In this way you can test how a new real device will improve your automated facility without stopping production at the real one. This technique is known as virtual commissioning. With virtual commissioning you can test how the different components of an industrial facility work together, adding the real devices one by one; this lets you identify possible failures of a component in a very complex system.
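A minimal sketch of the virtual-commissioning idea (all class and method names here are hypothetical, invented for illustration): the plant is simulated, and the controller slot can be filled either by a virtual controller or by a driver for the real device, as long as both expose the same interface.

```python
# Virtual-commissioning sketch (hypothetical names): a simulated plant
# exercised by a device under test behind a common interface.
class VirtualTank:
    """Simulated process: a tank that drains and can be refilled."""
    def __init__(self, level=50.0):
        self.level = level

    def step(self, inflow):
        self.level += inflow - 2.0            # constant outflow per step
        self.level = max(self.level, 0.0)

class OnOffController:
    """Stand-in for the device under test; a driver for the real device
    would implement the same `command(level)` method."""
    def command(self, level):
        return 5.0 if level < 40.0 else 0.0   # refill when the tank is low

def commission(tank, controller, steps=200):
    """Run the device against the virtual plant and record the levels."""
    history = []
    for _ in range(steps):
        tank.step(controller.command(tank.level))
        history.append(tank.level)
    return history

levels = commission(VirtualTank(), OnOffController())
print(min(levels) > 20.0 and max(levels) < 60.0)   # stays in a safe band
```

In a real setup, the `OnOffController` stand-in would be replaced by a driver exchanging signals with the physical device, while the virtual plant keeps running in software.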

Virtual industrial facilities of this kind have already been built. You can find products on the market to virtualize your own automated industrial facility, visualizing the behavior of the system on your computer screen in real time.

Virtual reality is a step further. Its aim is to put you inside the virtual environment instead of showing it to you through a computer screen. The benefits of this kind of technology are huge; however, it is not well developed yet. The problem is how to make people perceive the virtual environment as if it were the real one. Nowadays there are devices that put our eyes inside a 3D virtual world, but the experience will improve a lot in the coming years.

As usual with any valuable technology, governments are not seeing all the possibilities of virtual technology as a solution for many problems of our societies. While some people warn about a potential unemployment problem due to the development of robotics, nobody is pointing out the capability of virtual reality to generate employment.

Virtual reality is not only a matter of engineering. The supporting technology is; however, its capability to create employment for the many professions that do not fit into industry is huge. Virtual reality requires a virtual environment, and that environment must be imagined by an artist.

This can already be seen in the real economy. Although virtual reality is not fully developed, simulation techniques are acquiring great importance in a real market: the market of computer gaming.

The computer gaming market is growing year by year, driving the development of new electronic devices such as the graphics processors required to visualize any kind of virtual simulation quickly, increasing the market for high-speed communications, and providing jobs for many artists in imaging and script writing. In fact, we can probably say that it is now the computer gaming market, rather than traditional industry, that supports the development of many of the electronic devices required to improve virtual reality.

Most computer games are created for young people. There is an untouched segment related to the adult public. In that segment, the professions that could create jobs would be very different. Imagine a recreation of a play by Shakespeare in a virtual world where you can take Hamlet’s role. It would require knowledge of literature and history provided by several graduates in English literature and history.

If you do not think there would be demand for this kind of leisure, you only need to see how movie writers have imagined the way the adults of Star Trek spend their leisure time.

If industry really is becoming more automated every day, reducing the number of people working at its facilities, we should look at the possibilities that more leisure time and new technologies can provide for the employment of the future.

Finding this future will require that governments support entrepreneurs in the development and exploitation of new technologies, instead of putting spokes in the wheels of the people developing them just because they do not see the leisure market as a priority for their policies. In the same way that tourism is a very important engine of the economy of countries such as Spain, Italy or France, virtual reality could in the future be an important engine of the economy of any country without sun and the Mediterranean Sea.



Mr. Luis Díaz Saco

Executive President


advanced consultancy services


Nowadays, he is Executive President of Saconsulting Advanced Consultancy Services. He has been Head of Corporate Technology Innovation at Soluziona Quality and Environment and R & D & I Project Auditor of AENOR. He has acted as Innovation Advisor of Endesa Red, representing that company in workgroups of the Smartgrids Technology Platform at Brussels, collaborating to prepare the research strategy of the EU. Additionally, he is an expert in robotics and computer vision and a notable scientist in computer vision and brain research.


The structural differences among countries in the Euro Zone

In recent months, there has been a discussion about the model of governance of the Euro. New voices have been heard talking about the possibility that some countries might abandon the Euro as their currency due to the Greek political crisis. I have shown in several articles that the construction of the Euro is an irreversible process. This does not imply that a country cannot abandon the common currency; it only implies that the cost for all countries would be much higher than the benefits it has provided so far.

A rupture of any integration scheme increases the probability of a collapse of the economies. Of course, the effect of the exit of one economy on the rest of the system differs depending on the importance of that economy inside the common system. Greece was never seen as a systemic problem; however, other countries could have provoked a total disaster in all the Euro economies.


Graph 1 Contribution of the main Euro national economies to the GDP of the Euro Zone. Data Source: OECD

We can see that the Spanish economy contributes twice as much to the production of the Euro zone as the Netherlands. Fortunately, Spain seems to be far from creating a systemic problem now that it has recovered a path of higher growth.

The Euro zone is a complex scenario, because different policies are involved in every country. The aim of the Euro is, in fact, to make international trade inside the zone easier and cheaper. This is a way to reduce the increasing complexity implied by the creation of a common trade zone. I have analyzed this fact for several years from a complexity science viewpoint. Summarizing, the introduction of a common currency removes a lot of structure, substituting energy in the form of structure (several currencies) with a state of maximum entropy or uncertainty (a single currency). This is not a problem, although it can sound odd to some people. I have argued before that a flat frequency characterization of an uncertainty distribution can be better than high-frequency structure for controlling a system.

The problem with several currencies is not the number itself, but that they are a source of high-frequency structure when their value can change daily through trading, making the whole system less manageable by the political establishment.

The Euro removes the high-frequency structure related to currency exchange; however, it does not make the productive differences among countries more uniform.


Graph 2 Different production per capita in the Euro Zone. Data Source: OECD

An indicator such as GDP per capita shows that there are differences inside the Euro zone related to different social and productive models. There is some structure related to the productive models that is preserved inside the Euro zone, and it cannot be removed easily by thinking only of financial tools.

An interesting way of looking at this is to analyze some indicator of the added value of the labor factor, for instance, GDP per hour worked.


Graph 3 GDP per Hour Worked of the five main Euro economies. Data Source: OECD

This graph shows clearly that, in productivity, Italy has converged towards Spain instead of towards Germany or France. This indicator does not necessarily show a bad behavior of the labor factor. It can be related to the added value of the activities performed by the labor factor and, therefore, to the productive model. I will return to this point later to show why this latter option is a better explanation in the case of Spain.

The good behavior of the economy of the Netherlands can be analyzed through another complexity-related indicator: the PMR. The product market regulation index measures barriers to entrepreneurship, trading, and so on.


Graph 4 PMR index of the five main Euro economies. Data Source: OECD

Regulation is another way to structure the economy. In this case we can see how the effect of some degree of deregulation of the real economy can be positive. This is another clear example of how structure can sometimes be worse than uncertainty. It is well known that markets near perfect competition are more efficient than markets far from it. This indicator shows why the promotion of competition drives a better GDP per capita in the case of the Netherlands.

If regulation can be a barrier to growth, we should look at taxes:


Graph 5 Total Tax Revenue in the five main Euro economies. Data Source: OECD

We can see that countries with a lower taxation scheme have a higher productivity per capita. The case of Spain is very interesting to analyze. Spain has a taxation scheme similar to Germany’s, but the added value of its labor factor is much lower. This shows that common economic policy actions, such as increasing or decreasing taxes, cannot improve the economy by themselves. It is necessary to act on the productive model. The required economic policy action should be linked to an improvement of the productive model. For instance, reducing taxes on high added-value activities could promote a change of the productive model that a general tax reduction would not necessarily provide.

Another interesting issue that can be seen in the previous graph is the effect of unemployment on tax revenues. The great unemployment provoked by the 2007 crisis in Spain hugely reduced the Spanish government’s tax income. One of the main objectives of any government should be to act to preserve the level of employment of the people, instead of thinking about taxation or public expenditure policies with no effect on the real economy.

Now, I am going to analyze the social responsibility policies of different European countries to show that there are important differences.


Graph 6 Green House Emissions per GDP of the five main Euro economies. Data Source: OECD

The more pollutant economies are those whose productive model provides higher added value for the labor factor. This is related to industrialization. However, again, we can find a country showing a different behavior: France. This country does not have high emissions of greenhouse gases due to its bet on nuclear energy. Although nuclear waste can be more dangerous, it has no effect on the global warming of the planet.

If we want to analyze which are the least polluting citizens of this group of Euro Zone countries, we should look at the following graph.


Graph 7 Green House Emissions per Capita in the five main Euro economies. Data Source: OECD

As we can see, on average, each German or Dutch citizen is 50 % more polluting than each Spanish, Italian or French one, although he is also 50 % more productive. This shows that the differences in productivity are more related to the productive model than to the labor factor itself. This can be especially true in the case of Spain, due to the poor support given by some governments to the highest added-value innovation activities. There have been important initiatives and reforms, but a stronger effort is needed.


Graph 8 Percentage of graduates with a role in introduction of innovation in the main Euro economies. Data Source: OECD

In order to improve the productive model and innovation, it is not enough to promote entrepreneurship in general. What is necessary is to promote the entrepreneurship and innovation activities of higher-education graduates, because they will provide higher added-value businesses.

Spain and Italy have a similar level of GDP per hour worked; however, Spain has the lowest number of graduates with a role in the introduction of innovation and Italy the highest figure. On the other hand, Italy has the higher tax revenue over GDP and Spain the lowest. This shows that economic policy is not a matter of two or three typical actions, such as acting on taxes or expenditure.

Economic reforms must aim at improving the productive model of a country; they make no sense alone. Economic policy is a matter of many coordinated actions searching for a common objective.


  • The integration process of Europe is highly complex. There are a lot of differences among countries, and therefore a great need for economic policies driving convergence.
  • Financial mechanisms are very positive; however, they cannot provide all the solutions required to reach economic convergence.
  • Improvement of market regulation is a must in order to get better results and to preserve a process of supranational convergence.
  • There is a special need for coordination of the largest economies in order to avoid risks. This implies that certain countries must have an important presence in the coordination forums to reduce this kind of political risk.
  • Different productive models can be good if they take advantage of local capabilities; however, they can become a systemic problem in the future if the differences are provoked by bad policies.
  • Social responsibility issues should also be considered when we analyze the total value provided by different partners to the union.
  • Structural reforms in every country should be driven to provide a productive model of higher added value. Education and innovation cannot be forgotten.
  • Spain needs to improve its innovation process so that its economy can converge with the other Euro economies.
  • Economic policy is a matter of many coordinated actions searching for a common objective related to sustainable growth. It is much more than a matter of controlling debt and deficit.





The political correction of innovation management

Some years ago, I attended a meeting of the managing working group of the Spanish Electricity Networks Technology Platform for the first time. One of the main issues to be discussed was the need to ask the government for better support for power electronics engineers, as an important way to provide the grids with new smart devices. At a certain moment of the meeting, one of the attendees looked at me and said to everyone that perhaps we should not make a great effort on that point when electricity companies hire managing engineers instead of electronic ones. I was surprised, because I am an electronic engineer by profession, although I have additional education in business administration. However, he was right: if you look at my LinkedIn profile, you could easily think that it is the profile of a politician instead of the profile of an innovator.

I have done military service, I have been hired by the public administration, and my photograph was published in many newspapers when I was twenty-eight years old, because I was working on projects to improve the life of handicapped people; that work was even reported across the national press and television. I have collaborated in the development of regulation and standards, I have been a lecturer in economic forums for competitiveness, and I have attended international meetings for the economic development of the EU.

With this curriculum, I would probably have been an extremely good candidate to be president of the USA. In fact, most Spanish MPs do not have such a good curriculum for their activity because, in Spain, we are not as demanding with our politicians as the American people are.

It may be true that my profile seems to be that of a politician rather than that of an innovator; however, there is a good reason for this: I have been an innovation manager at large corporations instead of an innovation manager at start-ups.

I would be a good political candidate, but I do not have the support of a political party because I have never sought that kind of support. On the other hand, I would be a good engineer making innovative devices and products, at least I was when I was younger, but I do not have the image of an innovator because I have dedicated my career to management.

This is not a godsend. Like the attendees at the technology platform meeting, people will expect you to confirm their expectations. If you seem politically correct, people expect you to have the support of a strong organization, like a regular army officer, and you will find in front of you a handful of competitors with their very strong organizations behind them. If you seem to be an innovator, people expect you to be politically incorrect, having the strength and the willpower to convince people to adopt an idea very different from the one commonly accepted by the status quo, in the guerrilla way, undermining the arguments of competitors one step at a time.

There is a great difference between what we can call inner and outer innovation. The competences required to drive innovation are different if we want to introduce a disruptive innovation into a market or if we want to change the customs and strategies of a large organization.

This can be analyzed from a complexity viewpoint. Organizations are, by their own nature, very structured; however, markets are much less structured and their behavior has much more uncertainty. Driving innovation inside a large organization is similar to fighting in an open field (where you can see the resources of the competitor). However, introducing an innovation into a market with a start-up is similar to fighting in a jungle. You can take advantage of the fact that your competitor cannot see where you are, nor how many resources can be used against its market position.

Large corporations search for “politically correct” innovation managers because large corporations define what must be considered politically correct in their markets; however, startups require the opposite profile. A startup with a disruptive product will require a “politically incorrect” manager, someone who knows how to compete against the powerful structures of large corporations with fewer resources.

If we analyze the current economic situation, we can see that the complexity of the worldwide economy has increased in recent years. This should boost the role of innovation management for two reasons. “Politically correct” managers would be valuable to increase the strength of large corporations, driving changes that increase the resilience of their businesses. On the other hand, “politically incorrect” managers would have their own opportunity because, as market uncertainty increases, there is more room for societies to adopt great changes and better conditions for startups to be more competitive.




October 2014: Innovation in order to reduce the complexity of the business

From a strategic viewpoint there can be many routes to reach a certain positioning for our company. Thinking in mathematical terms, for a set of managing variables the best route would be the one minimizing the money spent. It could be seen as a geodesic curve on a hypersurface (the subset of states that we can reach through our managing actions) of the space of the variables defining the state of the company.

I am going to present a simple example to illustrate this.

Imagine a simple company with the following production function:

o = C · x

where o is the offer (quantity produced), x the amount of raw material, and C a constant.

And the following boundary conditions from the law of offer and demand:

d = -A · p + dt

where d is the demand, dt the total demand, p the price of the product, and A a constant slope.

The state of our simple economy is defined by two variables: x (the amount of raw material for production) and p (the price of the product), which determine sales and surplus, assuming the cost of raw material and the slope of the demand curve are constant. Setting offer equal to demand to maximize benefit links both variables, and the state of the system has a lower dimension (when there is no surplus).

B = Bs - S

where B is the benefit, Bs the benefit from sales, and S the cost of the surplus.

Understanding positioning as the market share we want to reach, our strategy would be defined by the percentage of demand we want to serve. This value directly defines the price of the product and the quantity of raw material we need. We have only one degree of freedom to define our price strategy if we want to maximize the benefit.

There is a natural equilibrium point where the offer and demand price curves match, providing an equilibrium value for the quantity of production, following the law of offer and demand. The optimum would provide a unique state for our production and prices. Then, as there is only one possible state, there is no need for a price strategy. In perfectly competitive markets, the minimum price would finally be imposed in this way, and our sales would be linked directly to those of the other competitors.
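The single degree of freedom described above can be sketched numerically. The constants C, A and dt below are arbitrary illustration values, not data from the article:

```python
# Sketch of the toy model above (all constants are illustrative).
# Production: o = C * x ; demand: d = -A * p + dt.
# Choosing the fraction s of total demand to serve fixes both p and x,
# so there is a single degree of freedom (s) in the price strategy.

C = 2.0     # units produced per unit of raw material
A = 0.5     # slope of the demand curve
dt = 100.0  # total demand at price zero

def strategy(s):
    """Given the served share s of total demand, return (price, raw material)."""
    q = s * dt          # quantity we plan to sell (offer = demand, no surplus)
    p = (dt - q) / A    # price read off the demand curve: q = -A*p + dt
    x = q / C           # raw material needed: q = C*x
    return p, x

p, x = strategy(0.4)    # serve 40 % of the total demand
print(p, x)             # -> 120.0 20.0
```

Every managing variable follows from the single choice of s, which is the sense in which the strategy space collapses to one dimension once surplus is ruled out.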

The law of offer and demand

A real economy is much more complex than this one. In a real case, the demand curve would not be linear. There would be other competitors serving the product, and if the market can be fulfilled by those competitors we cannot take the offer and demand curves as a direct reference for our sales. The cost of raw materials would follow its own offer and demand curve too. Then, there would be many more variables defining the state of the system, although we could still only act on the price of our product and our amount of raw material for production.

Thinking in this way, we can see that the result of any marketing strategy is, in practice, unpredictable.

Strategy is not a matter of solving a set of equations, but of anticipating the movements of competitors. We need a strategy because we are in competition; otherwise we would only need to minimize some cost function.

Strategic thinking requires a capability of prediction that can only be attained under certain states. For instance, in the previous example, if the offer of competitors can fulfil the total demand, we cannot assume that we will sell anything unless our prices are below those of the competition. We can only predict sales at lower prices. This is what happens in a crisis: demand is reduced, and only the companies that bet on reducing prices (modifying the offer curve) can control their own company, in the physical sense of the word control. As production costs impose a limit on the price of the product, the strategy that best fits that situation under this economic model is cost leadership.

And what about innovation? Well, in this case, innovation is not necessarily a drawback, but it should be driven to improve production by reducing production costs. Process innovation would be the logical innovation-related strategy. This would be swimming with the tide, the low-risk innovation strategy.

A more interesting approach is to forget mathematical models and use model-free techniques to analyze our activity. Risk can be seen as something subjective, related to perception. Even when we talk about mathematical risk, considering risk as not objective remains valid because, although mathematical risk is not observer-dependent, it is model-dependent. Sometimes swimming against the tide produces better results than swimming with it.

If we look back at the previous example, the problem with predictability is related to a fulfilled demand; then, a product innovation strategy can be a solution too, because a new product has a new, unfulfilled demand. What are the pros and cons? They are obvious. Although the demand may not be very high, you can select a high-price strategy that you could not use before, avoiding the limits imposed by the cost of production; and less competition can imply less complexity (fewer variables and less uncertainty to be considered to define the state of the business), as the initial example showed.

Mature markets can become more complex than novel ones, because there is a natural trend for entropy to increase in every system as time goes by.

A product innovation strategy, even in a crisis, can provide simpler businesses than a cost-leadership strategy, although many people may think of it as swimming against the tide. The important thing is to find the proper market or market niche. Probably they are right about the comparison: as when swimming against the tide, you will need the required energy to do it, or, in economic terms, the required money. Sometimes there are not only better or worse strategies but realizable and unrealizable ones too.

Swimming with the tide is comfortable, but comfort is not usually a synonym of survival, because not all competitors can win following the same strategy. The best soldiers are trained to accomplish tasks under the worst conditions, and there is a good reason for that.

