Category: Blog

Offshore wind energy – not in anyone’s backyard
Author: Ted Bergman
With today’s urgency to find quick and cost-efficient sources of clean energy, wind is expanding again. However, many producers still run into the “not in my backyard” mindset for wind production. Offshore wind energy is one solution, but there are still challenges to its widespread adoption. Luckily, new innovations are coming to the market to balance environmental concerns with commercial viability.
Entire energy supply chains are scrambling to find ways to harness the power of wind – while balancing ecological concerns with economic realities. Offshore wind energy has been gaining speed as one of the most viable sources for increasing renewable energy production.
The sea offers an open and nearly endless source of strong wind power – and is not directly in anyone’s backyard.
Why then is offshore wind power problematic?
One of the most often cited reasons is the significant environmental impact that wind turbines may have on marine ecosystems. Another recently emerging bottleneck is the lack of purpose-built turbine installation vessels needed to build large offshore wind farms. Let’s take a deeper look.
As the offshore market picks up, foundations and turbines are growing to harness the energy more efficiently. Today’s turbines of 15 MW will soon be surpassed by the coming 20 MW and 25 MW turbines planned for the not-so-distant future. Also, the total number of turbines and monopiles is expected to triple in the next three years.
New purpose-built vessels take years to build and cost a lot
Purpose-built wind industry vessels currently carry out most of the offshore installation work. Oil and gas vessels are also used when they are not needed in their own industry. But as the turbines grow in size, so too must the purpose-built vessels that install them.
With the booming market and growing sizes, new offshore installations now face a lack of vessels. Building new purpose-built vessels with jack-up legs and heavy cranes to serve the offshore wind industry takes many years and is getting more expensive all the time.
Commonly used monopiles are not suitable for the hard rock seabed
There are other realities to consider, as well: Although monopiles are used for the majority of all wind power, they are not suitable when the seabed is hard rock, like in the Baltic Sea.
And even if Sweden, Estonia, and Finland were to embark on offshore wind, only four installation vessels in the world can currently pass the Øresund crossing – the bridge and tunnel link between Copenhagen and Malmö – to access the Baltic Sea.
What happens in the future?
Despite the seemingly endless list of obstacles, new innovations are coming to the market that balance environmental concerns with commercial viability. These allow more areas globally to take advantage of wind power.
For instance, companies like Elomatic have been working on a foundation concept that can both eliminate the complications of offshore installations and avoid the negative environmental impacts.
The great advantage of the new foundation solution
The biggest advantage of Elomatic’s solution, called Float Foundation, is that it is gravity based, with no requirement for any kind of specialized installation vessel. The entire wind turbine and its foundation are all assembled at quayside with an onshore crane.
Once completed, the full structure is towed out to sea and lowered into its final location by ballasting it with seawater, assisted by installation barges and strand jacks. This video shows Float Foundation moving to its spot.
The heavy foundation with its unique skirt technology then dredges itself into place and penetrates any seabed material for a firm hold. It also copes well with high ice loads in Northern Baltic conditions.
Balancing ecology with economy
With new innovations, offshore wind farms become feasible in numerous locations where marine life must not be endangered or where monopiles were previously not an option.
Float Foundation is environmentally friendly, eliminating the detrimental noise associated with hammering and the negative impact on marine species and habitats. There is minimal soil movement or seabed excavation. Once the turbine reaches the end of its life, it can simply be floated back up to the surface and towed back to shore, where all steel and other valuable materials can be recycled.
We strongly believe this concept balances ecology with economy – in any location where harvesting offshore wind is desired. It even works in areas where existing technologies have failed so far, demonstrating a competitive cost level, a minimized transportation and CO2 footprint and the lowest environmental impact.
How can we ensure industrial resilience during the coming winter?
Author: Teemu Turunen
The current energy crisis has forced us to think about how to secure critical infrastructure and manufacturing operations. Concrete measures include the implementation of emergency fuel systems and moving away from natural gas in production plants. We should also examine the threat scenarios and resilience of our own activities more broadly once we have overcome the acute situation.
The term resilience has become widely used in various contexts in the media – first in regard to the COVID-19 pandemic and especially now, after Russia started a war of aggression. In general, resilience refers to the ability to withstand crises and recover from them.
Industrial resilience is also the ability to operate during a potentially protracted crisis and to develop operations in ways that ensure the organization comes back even stronger after the crisis. I find this aspect of operational development very important, because there will always be new crises, and learning from past ones allows us to be better prepared to face future ones.
One good framework for a more holistic approach to resilience in general, and especially to drawing up measures, is the four-step model used to promote cybersecurity: Predict, Prevent, Detect, Respond.
Resilience starts in design
When investing in new production plants or developing existing ones, factors that improve resilience should be taken into consideration already at the planning stage. Taking cybersecurity into account at a very early stage is especially important, as today nearly all production depends on information networks to some extent.
In addition, the availability of various automation components has recently been subject to delays of several years, at worst. For this reason, it may be necessary to design alternative solutions when considering new investments, in order not to be entirely dependent on a single supplier.
Buffer capacity for increased operational flexibility
Simple ways of increasing the “physical resilience” of an investment at the design stage include
- several parallel energy production methods
- ensuring sufficient backup power capacity
- duplication of critical systems and
- an optional ring main unit for electricity distribution.
Operational flexibility can also be promoted by designing sufficient buffer and accumulator capacity so that the process can continue after a deviation occurs. This buffer capacity may also make it possible to avoid periods of peak electricity pricing. This way, equipment such as dairy refrigeration systems or the refiners of board mills does not have to run at times when energy is expensive.
Making use of inexpensive energy with smart management
Adding predictive and intelligent management to storage and buffer capacity enables the use of these systems designed for disruptive situations even under normal conditions, to improve the efficiency of business operations.
Predictive management of energy-intensive processes enables activities to be concentrated in hours when energy is inexpensive. This is a good example of how crisis preparedness can strengthen operations further even after a crisis.
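As a simple illustration of the idea, the sketch below schedules a bufferable, energy-intensive task into the cheapest hours of a day-ahead price forecast. The price series, the 8-hour run requirement and the 2 MW load are invented figures for illustration, not data from any real plant or market.

```python
# Minimal sketch of predictive load scheduling (illustrative figures only).
# Hypothetical day-ahead prices (EUR/MWh) for 24 hours; not real market data.
day_ahead_prices = [
    41, 38, 35, 33, 32, 36, 55, 78, 92, 88, 80, 74,
    70, 68, 66, 72, 85, 110, 120, 95, 70, 55, 48, 44,
]

def cheapest_hours(prices, hours_needed):
    """Return the indices of the cheapest hours in which to run a bufferable load."""
    ranked = sorted(range(len(prices)), key=lambda h: prices[h])
    return sorted(ranked[:hours_needed])

def daily_cost(prices, hours, load_mw):
    """Energy cost of running a constant load during the given hours."""
    return sum(prices[h] for h in hours) * load_mw

if __name__ == "__main__":
    run_hours = 8        # the process needs 8 hours of operation per day (assumed)
    load_mw = 2.0        # e.g. a refiner line drawing 2 MW (assumed figure)

    naive = list(range(8, 16))                        # run during the working day
    smart = cheapest_hours(day_ahead_prices, run_hours)

    print("Scheduled hours:", smart)
    print("Naive cost (EUR):", daily_cost(day_ahead_prices, naive, load_mw))
    print("Smart cost (EUR):", daily_cost(day_ahead_prices, smart, load_mw))
```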
Naturally, we cannot be prepared for everything, and the development of resiliency is always a continuous process. It is important to remember that investing in design and technology in the planning stage will always be significantly less expensive than altering systems after the fact – or especially during an ongoing crisis.
Electrification of industry is the way to carbon neutrality
Author: Teemu Turunen
The electrification of industry is an important element in reducing greenhouse emissions, and the role of heat pumps is central in this development. It is particularly important to look at things from a broader perspective and utilize waste heat. In the best case scenario, several operators can benefit from the solution. However, as the processes become more complicated, the importance of their control must be remembered.
Electricity consumption in Finnish industry is expected to grow significantly. According to Fingrid’s extreme scenario, it will even double by 2030. Of course, more moderate developments have been outlined, and growth also depends on the industrial sector.
The forecasts do not come as a surprise as the electrification of industry plays an important role in reducing greenhouse emissions. Another driving force of electrification, security of supply, has come to the fore since the outbreak of the war in Ukraine. There are already large-scale projects under way, such as SSAB’s HYBRIT project, which represents electrification on a larger scale.
At its simplest, an old solution is replaced with an electric one
Electrification refers to a situation where, for example, a piece of process equipment that uses fossil fuel is replaced with an electric solution: a gas-powered forklift is replaced with an electric forklift, or a regular boiler with an electric boiler. Although this may sound like an easy solution, it is always important to anticipate the effects of the change on the process. For example, making bread is different in electric and gas ovens.
Electrification can also be implemented indirectly. In this case, electricity is used to produce, for example, hydrogen or synthetic fuels. The hydrogen economy is rising fast, although it will probably take the next decade before it has an impact on our entire energy system.
Heat pumps play a key role in the electrification of industry
Heat pumps bring an element of energy efficiency into play. The benefit is often realized only by looking at wider entities, and eventually the whole process. It is essential that waste heat can be utilized.
Heat pumps also make it possible to provide cooling and heating with the same system, and the benefits can be shared by several parties. This leads toward sector integration, where, for example, an industrial player sits at one end and an energy company at the other – and both benefit.
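To make the waste-heat argument concrete, here is a minimal back-of-the-envelope sketch of how the coefficient of performance (COP) behaves when a heat pump lifts low-grade waste heat to a useful delivery temperature. The temperatures and the rule-of-thumb efficiency are assumptions for illustration, not vendor data.

```python
# Rough COP estimate for an industrial heat pump (illustrative sketch).
def carnot_cop_heating(t_source_c, t_sink_c):
    """Ideal (Carnot) heating COP between a waste-heat source and a delivery temperature."""
    t_source = t_source_c + 273.15
    t_sink = t_sink_c + 273.15
    return t_sink / (t_sink - t_source)

def estimated_cop(t_source_c, t_sink_c, second_law_eff=0.5):
    """Real-world estimate: a fraction of the Carnot COP (0.4-0.6 is a common rule of thumb)."""
    return second_law_eff * carnot_cop_heating(t_source_c, t_sink_c)

if __name__ == "__main__":
    # 35 C waste heat upgraded to 90 C process or district heat (assumed temperatures)
    cop = estimated_cop(35, 90)
    electricity_mw = 1.0
    print(f"Estimated COP: {cop:.1f}")
    print(f"Heat delivered from 1 MW of electricity: {electricity_mw * cop:.1f} MW")
```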
The importance of controlling processes increases
Profitability is important in industry, and that is why thorough project work is needed: what works in theory is not always financially viable. However, the development of technology opens up new possibilities, as higher temperatures can be reached with heat pumps. In the future, the role of process control will also be emphasized as systems become more complex.
However, energy cannot be discussed without mentioning politics. At the moment, it is difficult to predict the price of energy when both production and the market are fluctuating. Great things can still be achieved if one keeps the big picture in mind: the subject should always be approached as a whole.
The benefits of using pushover analysis in earthquake engineering
Author: Lauri Saarela
In daily earthquake engineering, most earthquake analyses use linear or quasi-non-linear methods, where the non-linear behavior of structures is not taken into account explicitly during the analysis. The earthquake engineering field has long been in need of a performance-based analysis method that would also capture these non-linear effects, i.e. the inelasticity of structures.
Pushover analysis is a non-linear static method used in the seismic assessment of buildings. It takes into account the non-linear behavior of structures and thus fills the void left by linear analysis types. Even though pushover analysis has been around for a long time, it is still not widely used in daily engineering. While linear analyses fail to depict the non-linear behavior of steel structures, pushover analysis is a valid tool for assessing the stiffness, strength and ductility/resiliency of a steel structure in the inelastic range. It also meets the simplicity requirements of daily engineering projects, since the most accurate method, non-linear dynamic analysis, requires complicated data and is not simple enough for daily use.
During an earthquake, a building oscillates back and forth, and certain structural elements are meant to absorb the oscillation energy. If the earthquake is strong enough, these structural elements may yield and buckle in the process of acting as fuses to absorb most of the shock, while leaving important load-carrying elements in the building intact. In non-linear dynamic analysis, the whole oscillation history of the structure is analyzed. Pushover analysis is a static method, which analyzes the single worst possible oscillation (target displacement) that the structure would have during the earthquake. Pushover analysis allows for a detailed estimation of which structural elements yield and buckle during the earthquake and how and when these plastic mechanisms develop. In other words, the capacity of the building can be determined.
The capacity of the building is depicted as follows: as the top-story displacement is increased up to a certain value (the target displacement), the reaction forces at the base of the building (base shear) also increase. When a curve is plotted with the target displacement on the horizontal axis and the base shear force on the vertical axis, a characteristic capacity curve, or pushover curve, of the building is formed. An example of a pushover curve is shown in the figure below.
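As a small illustration, the sketch below assembles such a curve from made-up displacement and base-shear pairs (purely illustrative values standing in for actual analysis output) and reads off the peak base shear and a simple displacement ductility estimate based on an assumed yield point.

```python
# Illustrative pushover (capacity) curve from made-up analysis results.
import matplotlib.pyplot as plt

# Top-story displacement [mm] and corresponding base shear [kN]
# - hypothetical values standing in for static non-linear analysis output.
displacement_mm = [0, 20, 40, 60, 80, 100, 130, 160, 200, 250]
base_shear_kn   = [0, 800, 1600, 2300, 2800, 3100, 3300, 3350, 3300, 3100]

peak_shear = max(base_shear_kn)
d_yield = 60        # assumed yield displacement from a bilinear idealization [mm]
d_ultimate = 250    # target/ultimate displacement [mm]
ductility = d_ultimate / d_yield

print(f"Peak base shear: {peak_shear} kN")
print(f"Displacement ductility mu = {ductility:.1f}")

plt.plot(displacement_mm, base_shear_kn, marker="o")
plt.xlabel("Top-story displacement [mm]")
plt.ylabel("Base shear [kN]")
plt.title("Pushover (capacity) curve - illustrative data")
plt.grid(True)
plt.show()
```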
The design loads in elastic analyses are derived from elastic earthquake spectra, which are reduced to design spectra using a so-called behavior factor. The behavior factor of a structure depicts the non-linear behavior, or overstrength, that the structure possesses after its elastic capacity has been reached. These behavior factors are usually taken from codes, and there is discussion in the literature about whether they are always correct. Pushover analysis allows a more realistic, structure-performance-dependent re-evaluation of the behavior factor, as opposed to taking it from a code based on structure type alone and incorporating it into an elastic analysis.
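One common way to express this re-evaluation – a textbook-style formulation rather than a quotation of any particular code clause – splits the behavior factor into an overstrength part and a ductility part, both of which can be read from the pushover curve:

```latex
% Design spectrum reduced from the elastic spectrum by the behavior factor q
S_d(T) \approx \frac{S_e(T)}{q}, \qquad
q \approx \Omega \, q_\mu, \qquad
\Omega = \frac{V_y}{V_d}, \qquad
\mu = \frac{d_u}{d_y}, \qquad
q_\mu \approx
\begin{cases}
\sqrt{2\mu - 1} & \text{(short-period structures)} \\
\mu & \text{(medium- and long-period structures)}
\end{cases}
```

Here V_y and V_d are the yield and design base shears, and d_u and d_y the ultimate and yield displacements taken from the pushover curve; the two-case rule for q_mu is the classical equal-energy/equal-displacement approximation, not a code-specific value.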
Pushover analysis is a very practical and reliable tool when applied correctly. It does not require complicated data. Combined with the N2-procedure in Eurocode 8, pushover analysis can be a valid tool for determining the target displacement, base shear loads, plastic mechanisms and capacity of a structure that is symmetric and low in elevation. It is especially useful in estimating which elements of a structure would fail under an earthquake loading and in determining how it affects the global stability of the structure.
The animation below shows a transient analysis on the left and a corresponding pushover analysis on the right. The animations have been synchronized so that as the highest value of oscillation occurs in the transient analysis, the pushover analysis animation is run to the same value following the same displacement pattern. The plastic mechanisms forming in the pushover analysis are similar to those in the transient analysis.
Evolution of autonomous maritime operations driven by automation technology and digitalisation
Author: Kimmo Matikka
Background, prerequisite and first visions of autonomous ships
Radio communication for the maritime industry started to develop at the beginning of the 1900s. The period 1910-1970 saw the development of radio communication, the gyro compass, radar and heading control. As marine electronics matured in 1970-1995, route graphics on radar displays, track control, conning displays, GMDSS, GPS and electronic chart systems were developed.
The development of integrated navigation systems started already in the 70s, but it can be said that navigation system integration really took off in the early 80s with the electronic devices and systems then available, followed by speed control and AIS systems in the late 80s. The “Ship of the Future” study started in 1980.
While navigation and communication systems became more automated, the control and functions of machinery systems also started to be based on automation, thanks to better sensor technology and computers. Today, in modern vessels, almost everything is controlled via the ship’s integrated automation system (IAMCS), with protective functions for the equipment. Such a system may handle thousands of I/O channels. This allows automated HVAC systems, diesel-electric propulsion, and a power plant with a Power Management System (PMS), among others. Most functions are automated, while the ship’s crew monitors them and issues corrective actions and orders.
Visions of autonomous ships were published even before ship systems started to become automated. As early as the 1970s, a vision of autonomous ships was presented by Rolf Schonknecht in his book “Ships and Shipping of Tomorrow”.
Progress of autonomous maritime operations development
There are several names and definitions for different types of automated ships with different levels of autonomy: remotely controlled, unmanned, autonomous, and so on. IMO has also defined different levels for Maritime Autonomous Surface Ships (MASS). An unmanned ship can be remotely controlled or autonomous, but an autonomous ship must be able to operate without human assistance, even though it is monitored by a crew onboard or by a shore-based Shore Control Center (SCC). Although most ship systems are nowadays highly automated, there is still a lot of work to do to create a complete whole – with algorithms, artificial intelligence and machine learning – that can handle all the functions and circumstances that might occur in maritime operations. The maritime environment is much more complicated and variable than the road network onshore. The question of responsibility, if something unexpected happens, has not yet been adequately solved.
Japan had a project in 1982-1988 to develop highly reliable, intelligent and automated operational systems for maritime operations, remotely controlled from a shore-based control station. Korea started to research unmanned autonomous surface vessels for maritime survey and surveillance in 2011. The European MUNIN project in 2012, REVOLT by DNV-GL in 2013-2018, the AAWA project led by Rolls-Royce in 2015, Lloyd’s Register guidance in 2016 and the MOL project for an Autonomous Ocean Transport System in 2017 are just a few examples. The world’s first autonomous shipping company, Masterly, was established in 2018 by Kongsberg and Wilhelmsen.

In 2019-2020, some full-scale tests of remotely operated and autonomous vessels (a tugboat and an archipelago ferry) were performed. Although some companies have shown plausible and illustrative demos of autonomous vessels for the high seas, enthusiasm has settled toward short voyages close to the coast, on rivers and in city areas. In the long term, environmental friendliness and ecological efficiency may impose restrictions on propulsion power and speed, which might support the push toward autonomous and unmanned shipping.
Technology companies and developers have been worried about the slow reaction of authorities in creating regulations for autonomous shipping operations. As we have seen, however, local authorities have granted permits to test autonomous operations in designated areas. They do not want to hold back development, and technology is leading the way. Around 2015-2020, the hype around autonomous ships pushed IMO to concentrate more deeply on legislation for autonomous vessels. On the regulatory side, IMO took its first step toward addressing autonomous ships by defining the Maritime Autonomous Surface Ship (MASS) in 2018, and DNV GL released guidelines for autonomous and remotely operated ships. IMO is now running evaluation projects on the needs and content of regulation for autonomous ships and operations, according to its strategic plan for 2018-2023.
Automation technology of ship systems leads the way to autonomous maritime operations
Ship system providers develop automation within their own product portfolios, and that is probably the strongest accelerator in the race toward autonomous systems and in overcoming the challenges of this complicated operating environment. A few years ago, the hype around autonomous ships and digital transformation spread to many countries, not least in Europe. There have been several research and development projects, tests, and real projects to build autonomous ships as well. Nevertheless, it has proven not to be easy. The well-known “Yara Birkeland” project, a 120 TEU container vessel, is one example of the complexity of this kind of vessel and operation. Even though it is meant to sail only short voyages in a restricted area off the coast of Norway, the whole remains complex. The project is still alive and will most probably succeed, but it has been postponed. The initial goal was to start operation in 2019 and to be fully autonomous in 2020. The latest published schedule calls for a test run period during 2020, operation in 2021, and full autonomy in 2022. The hull is already on its way to the delivery shipyard for outfitting. Yara Birkeland is intended to be a zero-emission vessel, and loading and unloading are also planned to be autonomous.

MacGregor responded to a customer request from ESL-Shipping to develop an autonomous crane. The autonomous discharging crane system for bulk cargo operates driverless, controlled from the command bridge. That is one impressive example of how the whole is being knit together from individual equipment and systems, one by one.
Today, there is such a wide range of automated and autonomous systems available that major players have good opportunities to integrate and deliver all the systems needed for fully autonomous vessels intended for short voyages in restricted areas. As vessels become more integrated and connected through digitalization, they become more autonomous, regardless of whether they have a crew on board or not. A lot of effort has been put into developing algorithms and AI. For example, ABB, Wärtsilä and Kongsberg, with their partners, all have the opportunity to achieve this goal in the near future. Many sources state that autonomous vessels will sail within the next few decades. Nevertheless, autonomy will grow step by step, following the development of individual system providers. The next step is to have autonomous systems that support the crew in their actions and decision making. In the beginning, the crew – onboard or at control stations ashore – will use the support of autonomous systems while still retaining control of the ship. Autonomous shipping and autonomous maritime operations do not only mean the autonomous ship. Fairways and harbor areas also have to be integrated into the whole, including piloting and VTS control, berthing and cargo operations.
Regardless of the operational format – autonomous, remotely operated/controlled or traditionally operated by the ship’s crew – the vessel needs to be engineered in any case. Elomatic Consulting and Engineering Ltd looks forward to being a partner for ship owners and shipyards in ship design and engineering processes in all phases, meeting future needs.
Elomatic has competitive teams for hull design, outfitting and machinery engineering, as well as for electrical, automation and interior design. Elomatic also has a strong team for visualization and simulation. Elomatic’s personnel have versatile experience, with strong knowledge of design engineering and project management as well as operational experience – including deck officers, master mariners (captains) and marine engineers with long sailing careers.
Does the ship design project benefit from the digital twin approach?
Author: Juhani Kankare
I have been thinking about what benefits the shipowner and the yard gain if they start concentrating on the digital twin approach in the early design phase. At the very least, the virtual ship will match the real one more accurately, and it will be ready with less effort at the end, since it is built up during the design phase. Project management can visually follow the maturity of the design. But can the ship design project’s decision-making also benefit from the digital twin concept?
Safety and Emergency scenarios
Consider, for example, emergency-related scenario animations such as safety and evacuation simulations. This information is beneficial when designing the ship, but also later when training the crew. A good visual animation helps all parties better understand the kinds of issues or bottlenecks passengers or crew face in real life. In the design phase, the designer can find better solutions by viewing animations of different scenarios, ensuring the most secure options are chosen.
Smoke and flow simulations
Another example is visual animations of design-phase results, such as CFD results for selected technologies. Exhaust gas animations, simulations of smoking areas and other similar simulations may later help the crew understand the limitations of certain activities.
Design reviews, architect reviews
A digital twin supports more effective decision-making thanks to the level of detail that can be reviewed. The architect and the owner can virtually review different areas such as restaurants, casinos or cabins. A virtual review makes it possible to compare many alternative solutions, unlike physical models. It also lets people spend more time with the sample designs, since they can view them at their own office via VR or simply on their own computer display. Building a virtual sample of an area is much faster and more cost-efficient than building a physical one, and a review can include many different solutions because there is no limit of physical space. Later, the selected models can be reopened and re-checked if needed.
Decisions faster and more accurately
Design and architect reviews are held because big decisions related to manufacturing and equipment investments need to be made. Using a digital twin reduces costs and risks through a better understanding of the details and gives more tools for making accurate decisions. Having access to design information is good, but the ability to visualize that information provides an additional level of knowledge.
Production phase
Often the design is divided into many small parts according to technology and physical location. It is beneficial to have a direct link between production and the design houses. The digital twin can be used to point out problematic areas with a virtual finger during discussions. Shipbuilders can manage changes more efficiently when errors can be corrected faster and with less bureaucracy.
Alerts can also be set up for certain areas or parts to flag problems between production and design. Project management can follow the progress visually in the virtual ship.
Nowadays there is a need to set up systems that support global activities, since physical meetings face many limitations. A digital twin may sometimes even remove the reason to meet at all, when the information is commonly available via the cloud on everybody’s terminals.
Fuel cells
Author: Tobias Eriksson
The marine industry is well-known to be a significant source of harmful local (NOx, SOx and particulates) and global (mainly CO2 and CH4) emissions. The high emissions are a result of the traditional low grade “bunker fuels” used, which mainly consist of residuals and low-grade distillates. In recent years, however, public pressure regarding air pollution and climate change has caused governments and authorities to take action to reduce them. As a result, stricter regulations encourage the ship industry to find new clean fuels and/or energy efficient solutions to meet the progressing limits on pollutant emissions.
Making the transition to an alternative fuel in shipping is a major undertaking, but likely required if the set emission reduction goals are to be achieved. As the marine industry finds itself under pressure to mitigate harmful emissions, it is no longer a question of if, but rather when, a transition to alternative cleaner fuels will take off. When it does, it can offer a good foundation for introducing new alternative technologies for power production onboard. One interesting and promising technology that could play a key role in mitigating harmful local emissions from ships is the fuel cell. Fuel cells, like hydrogen, have experienced a hype cycle before, but thus far commercialization and widespread interest in marine use have remained low. This, however, could change if new fuels are introduced in shipping.
Fuel cells are electrochemical conversion devices that convert the chemical energy of a fuel directly into electrical energy. Converting chemical reactions directly into electrical energy offers some unique advantages compared to an internal combustion engine, where the chemical energy passes through thermal and mechanical work before becoming electrical energy. Since there is no combustion involved, fuel cells can produce power with less formation of pollutants. Furthermore, the lack of internal moving parts means fuel cells generate very little noise and vibration during operation. Fuel cells could therefore also be an effective way to reduce the destructive low-frequency underwater noise radiated from ships’ machinery, which is known to have both short- and long-term negative consequences for marine life.
A basic fuel cell mainly consists of three active components: two electrodes, i.e. an anode and a cathode, and an electrolyte sandwiched between them. To produce electricity, fuel is fed continuously to the anode, and an oxidizing agent, typically air, is fed to the cathode. The electrochemical reactions take place at the electrodes, producing an electric DC current through the electrolyte. The working principle of a fuel cell resembles that of a battery. But unlike a battery, which consumes its reactants and oxidant and must be recharged when depleted, a fuel cell will continue to produce electricity as long as fuel and oxygen are supplied to the cell.
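To get a rough feel for the numbers, the sketch below estimates how many series-connected cells a stack needs to reach a given electrical power. The per-cell voltage, current density and cell area are assumed, illustrative values, not data for any specific product.

```python
# Back-of-the-envelope fuel cell stack sizing (illustrative assumptions only).
def stack_power_kw(n_cells, cell_voltage_v=0.7, current_density_a_cm2=0.6, cell_area_cm2=300):
    """Electrical power of a stack of series-connected cells.

    A single cell typically operates well below its ~1.2 V theoretical voltage;
    0.6-0.8 V is a common working range (assumed here, not measured data).
    """
    current_a = current_density_a_cm2 * cell_area_cm2  # same current through every cell in series
    return n_cells * cell_voltage_v * current_a / 1000.0

def cells_needed(target_kw, **kwargs):
    """Smallest number of cells whose stack power meets the target."""
    n = 1
    while stack_power_kw(n, **kwargs) < target_kw:
        n += 1
    return n

if __name__ == "__main__":
    target = 100.0  # kW module, e.g. one unit of a distributed shipboard plant (assumed)
    n = cells_needed(target)
    print(f"{n} cells -> {stack_power_kw(n):.1f} kW")
```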

Fuel cells can be classified into different categories, but the most common classification is based on what electrolyte is used and includes five major groups:
- Alkaline fuel cell – AFC
- Phosphoric acid fuel cell – PAFC
- Proton exchange membrane fuel cell –PEMFC
- Molten carbonate fuel cell – MCFC
- Solid oxide fuel cell – SOFC
Regardless of which electrolyte is used, a fuel cell system consists of more components than the fuel cell stack itself. The additional auxiliary components required to generate electrical power from the stack are often referred to as the balance of plant. The stack and the balance of plant together form what is generally referred to as a fuel cell unit, or a fuel cell system.
Forget the typical ship machinery compartment layout. The modular design of fuel cells gives them some unique features that could revolutionize conventional shipbuilding. Unlike a combustion engine, fuel cell efficiency and load factor are not dependent on system size, which means the performance of a single cell does not differ from that of a large stack. This excellent modularity means applications ranging from less than 1 W up to multi-MW power generation systems are possible. This neat feature means the power production from fuel cells can be distributed over the ship as smaller units, e.g. one fuel cell module per main fire zone. The advantages of a decentralized power plant solution are reduced electricity transport losses and improved redundancy.
Hydrogen would be the ideal choice for fuel cell operation, but its characteristics arguably make it a challenging fuel for marine use. In general, fuel cells can only utilize fuels that are hydrogen-rich and in gaseous phase. This essentially means that none of the currently used bunker fuels (i.e. HFO and MGO) are compatible with fuel cells. Fuel requirements also vary depending on the fuel cell technology, but typically, the lower the operating temperature of the fuel cell, the more stringent the fuel requirements. The chosen fuel will ultimately decide which fuel cell technology is suitable. With so many different types of ships on the water, however, no one solution fits all. For instance, hydrogen-fuelled, low-temperature fuel cells such as PEM-based fuel cells could be suitable candidates for ships with short autonomy requirements, e.g. small ferries and river boats. Deep-sea shipping and ships with high power demand (e.g. cruise ships), on the other hand, require significant volumes of fuel onboard, which quite quickly rules out hydrogen as a suitable fuel. For such applications, high-temperature fuel cells like solid oxide or molten carbonate fuel cells could be more viable options, as they are more flexible in their fuel choices and can utilize fuels already relevant for shipping, e.g. LNG, methanol and truck diesel; ammonia has also proven a feasible option. The higher operating temperatures of the SOFC and MCFC also mean high-quality waste heat can be recovered, which further improves overall plant efficiency.
Fuel cells are still a novel technology, which is also reflected in their price. They are expensive – not just slightly, but an order of magnitude too expensive to be competitive. To promote fuel cell development, cost reduction and market deployment, fuel cells still depend on state and federal tax incentive programs to help offset their current high system cost. The two major reasons why fuel cells are so expensive are closely interlinked: low production volumes and expensive constituent materials (such as platinum in some fuel cells). Without widespread adoption of the technology, prices will remain high. The high cost is also closely related to the relatively short lifetime of current fuel cell stacks, which varies from 2 000 up to 40 000 h depending on the technology and application. In marine applications, where the power plant is basically in operation close to year-round, stack replacements would have to be carried out quite frequently at current expected lifetimes. In several lifecycle analyses, however, fuel cells have proven to be cost-competitive, provided the key issues related to the initially high investment cost and expensive periodic stack replacements can be addressed.
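The arithmetic behind the replacement concern is straightforward. The snippet below uses the stack lifetime range quoted above together with an assumed operating profile; the 8 000 h/year and 25-year service life are assumptions for illustration, not figures from the text.

```python
# How often would fuel cell stacks need replacing in marine service? (rough estimate)
annual_operating_hours = 8000       # assumed near-continuous operation
vessel_service_life_years = 25      # assumed vessel lifetime

for stack_life_h in (2000, 20000, 40000):   # lifetime range cited above
    years_per_stack = stack_life_h / annual_operating_hours
    replacements = vessel_service_life_years / years_per_stack - 1
    print(f"{stack_life_h} h stack: replace roughly every {years_per_stack:.1f} years "
          f"(~{replacements:.0f} replacements over the vessel's life)")
```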
Adapting fuel cells to marine applications is technically challenging, but by no means unfeasible. Earlier demonstration projects have proven that the majority of fuel cell technologies are suitable for marine use. The biggest chicken-and-egg problem fuel cells face, however, is the lack of fuel-cell-compatible fuels in the marine industry. Fuel cell suppliers may therefore not see the marine market as large enough to be a strategic and profitable investment, and developing fuel cells specifically for marine use has not been prioritized. Additionally, there is the common dilemma for any new and unproven technology that “no one wants to be the first customer, and no one wants to be the second or third either”. To overcome these issues, a deep understanding of the market and a striking idea are needed. At Elomatic, we can help with both – let’s put the pieces of the chicken-and-egg puzzle together!
Why Digital Prototype?
Author: Karl-Kristian Högström
Elomatic and Devecto have developed a method to combine the 3D world and the physics model of a machine into a realistic digital prototype. In this article, our guest writer Karl-Kristian Högström talks about the method, the technology behind it, and the benefits it offers in R&D.
Minimizing the time spent iterating between development and testing is crucial, especially when developing complex machinery, where building a prototype is a tedious and time-consuming process. The use of agile methodologies and fail-fast principles in R&D helps, but further improvements are needed. The use of digital prototypes can be this next step.
For a digital prototype to be useful, we need to be able to examine it from different points of view – we need both “the look” and “the feel”. 3D models and virtual landscapes offer the look: we can evaluate how our machine looks on the inside and outside, and to some extent its usability. Evaluating performance, however, is not possible with a 3D model alone. The feel can be obtained by modelling the dynamics of the machine mathematically. By connecting the dynamics model to the 3D model, we get both the look and the feel in a realistic digital prototype that can be used for testing and evaluation prior to building the physical prototype.
The joint power of Unity and Simulink
There are many proprietary simulation systems available, and there has certainly been a need for simulation systems that are easy to use for a specific purpose – a typical example is a training simulator. However, the development possibilities with a proprietary system are always limited and/or require special knowledge.
Elomatic and Devecto have developed a method called The Link to combine the 3D world and the physics model of the device/machine together into a realistic simulation with commonly used and widely available tools. The 3D world with a realistic landscape is running in Unity. The physical properties of the machine are modeled in Simulink. These two models are connected so that information flows both ways. Unity and Simulink as development tools are improving all the time and the development of a simulation system with them is already very efficient. The system is not limited in any way to a specific purpose, and simulation can be used as a tool in various phases of the product development.

The example presented here is an imaginary Terex Fuchs material handling machine. The control system, hydraulics and mechanics of the machine are modeled in Simulink, and the machine operates in a virtual landscape in Unity. A dashboard shows values from the machine model, such as hydraulic pressures in the cylinders, and also makes it possible to adjust parameters in the Simulink model. Collisions and other interactions with the virtual environment flow from Unity to the Simulink model.
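The actual protocol of The Link is not described here, so the sketch below is only a conceptual stand-in: a minimal bidirectional UDP exchange in which one process plays the role of the physics model (sending a cylinder pressure) and the other the visualization (sending operator commands back). The ports, message fields and toy dynamics are invented for illustration and do not reflect the real Unity–Simulink implementation.

```python
# Conceptual stand-in for a bidirectional physics <-> visualization link
# (NOT the actual protocol used by The Link; ports and fields are invented).
import json
import socket
import threading
import time

PHYSICS_PORT = 50001   # visualization -> physics (operator commands)
VISUAL_PORT = 50002    # physics -> visualization (machine state)

def physics_model(stop):
    """Toy 'Simulink side': integrates a cylinder pressure and reacts to lever commands."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", PHYSICS_PORT))
    sock.settimeout(0.05)
    pressure_bar, lever = 50.0, 0.0
    while not stop.is_set():
        try:
            msg, _ = sock.recvfrom(1024)
            lever = json.loads(msg)["lever"]            # command from the 3D world
        except OSError:
            pass                                        # no command this cycle
        pressure_bar += 5.0 * lever - 0.1 * (pressure_bar - 50.0)   # crude dynamics
        state = json.dumps({"pressure_bar": round(pressure_bar, 1)})
        try:
            sock.sendto(state.encode(), ("127.0.0.1", VISUAL_PORT))
        except OSError:
            pass                                        # visualization side not up yet
        time.sleep(0.02)

def visualization(stop):
    """Toy 'Unity side': displays the state and sends an operator command back."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", VISUAL_PORT))
    sock.settimeout(0.05)
    for step in range(100):
        lever = 1.0 if step < 50 else -1.0              # pretend the operator moves a lever
        sock.sendto(json.dumps({"lever": lever}).encode(), ("127.0.0.1", PHYSICS_PORT))
        try:
            msg, _ = sock.recvfrom(1024)
            if step % 20 == 0:
                print("dashboard:", json.loads(msg))
        except OSError:
            pass
        time.sleep(0.02)
    stop.set()

if __name__ == "__main__":
    stop = threading.Event()
    threading.Thread(target=physics_model, args=(stop,), daemon=True).start()
    visualization(stop)
```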
Learn more about digital prototypes!
Homepages: The Link – Digital Prototype
Upcoming event
Simulink Forum, 12.2.2021 at 9.30-12.30 EET.
Theme: “Using simulation and digital prototypes in different stages of product development”
Agenda:
- What is a digital prototype? + demo, Harri Laukkanen, Devecto & Jani Moisala, Elomatic
- Simulation as a part of testing and quality assurance, Antero Salojärvi, Avant Tecno
- The benefits and the challenges of modelling, Aleksi Vesala, Valtra
- Utilizing models in various stages of product development, TBD Mathworks
The way we work has changed. Are you ready for a post-pandemic life?
Author: Rose Cugal
Reduced demand, disrupted supply chains, furloughed workforces, and mandated shutdowns have caused businesses around the world to adapt to the new normal. As we move towards a post-pandemic life, it is crucial to know your options and the alternative methods to communicate, educate, and do maintenance. In this article, I will talk about how you can adapt to the new normal using the latest technology to reduce pressure and keep your business viable.
Visualization is now an acknowledged and critical tool for industrial leaders. Extended reality (XR) is a visualization method and a general term for virtual reality, augmented reality and mixed reality (read more about XR here). Not long ago, these technologies were just a fantasy to most, but nowadays they are applicable to many industrial needs. Modern gear offers better performance at an affordable price. Major trends – globalization, digitalization and sustainability – are leading to the use of XR technologies on a larger scale. XR not only addresses many challenges, such as the need for fast iteration and language barriers, but it is also an environmentally friendly option, thanks to the reduced need for physical prototypes and travelling.
Quicklinks:
- Remote work will continue to grow, but what will happen to customer engagement?
- Virtual prototypes lead to smarter and cheaper product development
- Faster onboarding, training and reduced work-place casualties
Remote work culture will continue to grow, but what will happen to customer engagement?
Remote working and social distancing are among the main effects of the pandemic. The lack of human interaction and travel restrictions have certainly been a challenge for all industries and have changed the way we work.
According to research from early 2020, 55% of US employees would prefer a mixture of remote and onsite working. In the UK, the estimated share of remote workers has doubled from a pre-pandemic 18% to a post-pandemic 37%. In China, employment expert Alicia Tung has estimated that the onsite/remote work ratio will be 60/40 within 10 years.
How can we keep the same level of performance and engagement while keeping the perks of working from home? Extended reality offers a strong alternative to unreliable and impersonal calls. Hosting online webinars has increased drastically during the pandemic. Although this is a good way to engage with your audience, it is not a long-term alternative to fairs. The biggest benefit of using XR is its ability to connect people and make it feel as if they were in the same space regardless of their location.
A crucial part of human interaction is non-verbal communication, such as tone of voice, gestures and facial expressions. In most cases, people have their video cameras off. People multitask, and their attention is not fully on the topic. Using virtual reality in communication, teamwork and customer engagement removes these additional distractions. Participants can be truly engaged and collaborate in the same virtual environment in real time. 3D objects and animated procedures can be brought into the virtual environment, offering participants a chance to explore your whole site, process or machinery in full detail.
Smarter product development and fewer physical prototypes
As previously mentioned, the changes in our way of working are also due to global megatrends, such as globalization. A modern team has distributed workers and remote collaboration is common in today’s economy. A well-functioning team and synchronized working is important. There is a need for systems that can keep up with the correct data, real-time changes and information to manage evolving and changing projects.
Processes can be simulated in virtual environments for lean validation and development. Creating digital prototypes ensures that products and processes are tested and optimized. The design, validation and even training can be done with digital prototypes before anything physical is built. This removes much of the trial and error. By linking the real with the digital, the design and user experience can be tested in real use scenarios, the customers get to see what they are purchasing, and the users can be trained in advance regardless of their location.
Industry example: The Link – Digital Prototype
Devecto and Elomatic developed a method that links the 3D world with the physics model of a machine, creating a realistic simulation that is easily accessible and controlled with common tools – a PC and a console. The method, called The Link, connects the two models so that information flows both ways. By linking the dynamics model to the 3D model, both realistic visuals and an authentic feel are achieved in a digital prototype that can be used for testing and evaluation prior to building the physical prototype.
Faster onboarding, training and reduced work-place casualties
The use of extended reality in employee onboarding and training has increased in the past few years in different fields, and it will continue to become more common in the coming years. As remote work increases, flexible training methods become necessary.
Training and employee onboarding can easily be done in interactive virtual reality environments, as procedures can be simulated in realistic, true-to-size environments. This allows for better performance and fewer workplace accidents. Furthermore, research shows that memory retention after a VR experience is twice as high as after video- or text-based learning materials.

Using virtual training, imperfections in safety procedures can be spotted and additional safety steps can easily be added, which leads to improved workplace safety. Embedded visual information helps users learn and understand on their own terms. Maintenance and installation tasks can be rehearsed beforehand to avoid maintenance breaks and decrease downtime.
The pandemic has surely affected the way we work and has increased the need for XR solutions. While the future may be uncertain, the growing role of XR in different industrial fields is assured.
Visualization in passenger vessel design – Virtual world gaining on physical
Author: Lasse Raakavuori
Visualization is an incredibly versatile concept, and with new and evolving technology, it is safe to say that the use of visualization will only increase and diversify further. It has the potential to become a superior tool of communication, from the initial ship concept to optimizing the operation of an existing ship. But to keep things simple, let’s focus on what visualization can bring to the table already today, and narrow the scope to something a cruise vessel has a lot of and can’t live without: interior spaces.
Table of content:
- Offering a positive customer experience through interior design
- Why are physical mock-ups slowing you down?
- Virtual mock-up saves time, effort and costs
- Key factors for a successful and realistic virtual mock-up
- Creating added value with simulation and embedded information
Offering a positive customer experience through interior design
Interior design is a make-or-break factor in the success of a commercial passenger vessel. If a cruise does not attract potential customers with aesthetic cabins, captivating public spaces and a plethora of exciting activities during the voyage, those customers are unlikely to turn into passengers. Cruise trends evolve and fluctuate, but one ever-remaining constant is that the passengers’ experience must exceed a stay in a regular hotel on dry land, or they cannot be expected to come onboard.
Due to the large cruise offering from various companies, a constant question on shipowners’ minds is how to make customers return to a cruise they have already been on, or at least pick their next cruises from the same company. The simplified answer is a positive customer experience. Every solution in every shipbuilding discipline affects the overall performance, safety and effectiveness of a cruise ship, but no other decisions affect the customer experience as much as those concerning interior design. That is why it is a sector in which it is crucial to invest.
Why are physical mock-ups slowing you down?
Usually, shipowners outsource the design of interior (and largely also exterior) spaces to architects and interior design offices. Their work is refined down the line by marine design professionals, i.e. engineering offices, so that all regulations are adhered to. After the designs are complete, the work is transferred to turnkey (TK) suppliers. With their experience, many materials and structures can still undergo changes to ensure that the level of complexity and the budget are not exceeded. Before final approval, TK suppliers need to make physical mock-ups of several items in a space, or even of a complete small space such as a passenger cabin. The notion to take away from this process is that designs are in a state of flux for a long period and start to freeze only once something physical is produced. The reason is that until then, it is hard to fully grasp how a certain space, or an item inside it, works and looks – at least with the current use of visualization tools.
From the viewpoint of visualization, the only delivered material in most cases is 2D renderings made by architects. This material is meant to augment e.g. AutoCAD drawings. The most valuable thing they show is the materials and textures used in a space. However, since these renderings are not made by marine design professionals, and the emphasis is only on delivering visual information rather than solving technical issues, they often contain impossibilities, and the structures are simplified. This lowers the material’s value. Overall, this may lead to visualization being considered a low-importance activity in shipbuilding, used only as marketing material for the interested and as “nice to know” information for the designers and builders of the vessel. This is an outdated take, because visualization can be a valuable, time- and money-saving instrument if the toolset (and mindset) is updated to its potential. The spearhead of the toolset is virtual reality (VR), and the goals it can achieve already today are speeding up design, cutting costs and creating a basis for early decision-making.
Virtual mock-up saves time, effort and costs
Virtual reality is not a new or exclusive thing. Many people have already had some form of contact with it. Shipowners and shipbuilders may have tested it, or even procured their own gear. Architects can nowadays also provide VR scenes of the spaces they have designed. The general problem, and the hindrance to further expansion, seems to be underwhelming experiences and the limited value of the results. But the value of the results is in direct correlation with the amount of effort.
It is rather simple (read: cheap) to create a VR scene from an already existing 3D model. Very commonly, that is where the effort stops. The viewer can immerse themselves in the space, but it is static. Nothing moves or interacts except the viewer. The lack of effort lets the viewer casually walk through walls and furniture, which makes the experience feel unrealistic. Top that off with a low-end headset with an eye-irritating pixel screen, and the result is a bored viewer who can’t wait to finish because of a looming headache. Ironically, if things are done differently, the experience is the total opposite. That is why it is important to convey what can be achieved by adding a little more effort.
EFFORT DIFFERENCE IN PHYSICAL MOCK-UP VS. VR MOCK-UP
But first, let’s put the effort into perspective. It only makes sense to study the effort level a high-quality VR space needs if it is compared against the same levels for a physical space. A crucial fact to remember is that a virtual mock-up can be made as soon as initial plans are ready, with a fraction of the effort of a physical mock-up. This means that valuable information is available much sooner in the project, and money is saved by making decisions and correcting mistakes before any physical production. Once the elements are created, it is even faster to make changes whenever the need arises. Nothing needs to be gathered, shipped, assembled and disassembled. There are no material costs. When these aspects are stacked up, it is clear that there is a lot of headroom for virtual working before even approaching the effort needed for a basic physical mock-up. Additionally, the VR space can be viewed from anywhere on the globe, by as many people as needed, without travelling.
Key factors for successful and realistic virtual mock-ups
The key to making VR a valid tool is to increase its information value. Elomatic builds all VR scenes with a physics engine, so they can be made interactive. This means that doors, windows, cabinet drawers and so on can be opened and closed, and furniture and other items can be moved with a virtual hand. Walls can’t be penetrated: if the viewer walks toward a wall, the movement is stopped in the VR scene, and it is also possible to insert a sound effect implying contact. This takes spatial studies of e.g. passenger cabins to another level.
The only thing still lacking from the VR experience is touch, but with the features mentioned above, this is becoming less and less of a problem. However, if it is imperative that walls and similar surfaces can be touched, a simplified physical space with correct dimensions can be built from simple materials such as plywood. In it, the viewer can walk around with a VR headset on and touch a wall where it is also virtually seen. Even then, the cost stays below that of a full-on physical mock-up.
Different layouts, textures and appearances can be compared back and forth with a flick of a hand. Accurate and realistic lighting conditions can be generated in the space, so it can be viewed, for example, in daytime, at nighttime and under possible emergency lighting. Windows can show an animated scene, like ocean waves, and since many high-quality VR headsets also contain headphones, the immersive experience can be deepened with sounds. The image quality of these headsets is pleasantly high, so headaches can be forgotten; there are already headsets on the market so precise that individual pixels cannot be seen at all.
Creating added value with simulation and embedded information
A VR scene should be thought of as a separate, engineered world of its own. When the visualization team is backed up by a complete spectrum of marine design professionals, as at Elomatic, that world can resemble our physical world quite accurately. For example, walls are not just surfaces without material thickness; they actually have the correct thicknesses and layers. This way, details such as joints can be inspected – a level of information that very few can offer and many are interested in. Still, the greatest benefit of the VR world is that it can also be augmented with information that is intangible in our physical world.
A great example of what is meant by this is simulations. Elomatic employs a technical analysis team capable of running various accurate simulations, which the visualization team can in turn convert into easily understandable visual data – existing in the virtual world but representing the unseen reality. Let’s say an AC designer would like to see how the air flows in a passenger cabin. The fresh supply air coming in can be shown as a visible gas cloud, presented in blue. As it spreads in the room, it turns orange before vanishing into the exhaust. The designer can literally stand in the middle of the simulation, and it can change dynamically when the space is manipulated, for example when the balcony door is opened.
One additional benefit worth mentioning is that virtual spaces can also include all kinds of incorporated infographics, which can show numeric and written information, usually in the form of a floating pop-up screen. This makes evaluations and comparisons that much easier. When considering a public space with a lot of loose furniture, the infographic can show, for example, the change in price if certain furniture is swapped for another. A key factor in passenger comfort is noise levels: all virtual materials can be embedded with noise damping attribute data, so the computer can calculate the decibel level in the space.
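As a sketch of what such a calculation can look like, the snippet below sums the absorption attributes of each surface into a room constant and estimates a diffuse-field sound pressure level for a given source. It uses classical room-acoustics formulas with invented surface areas, absorption coefficients and source data; it is not Elomatic’s actual implementation.

```python
# Rough diffuse-field sound level estimate from material absorption data
# (textbook formulas; surface areas and absorption coefficients are invented).
import math

# surface: (area in m^2, absorption coefficient in the band of interest)
surfaces = {
    "carpet floor":    (120.0, 0.30),
    "ceiling panels":  (120.0, 0.70),
    "wall panels":     (180.0, 0.15),
    "windows":         (40.0,  0.05),
    "loose furniture": (30.0,  0.40),
}

absorption_area = sum(area * alpha for area, alpha in surfaces.values())  # total A [m^2]
total_area = sum(area for area, _ in surfaces.values())
mean_alpha = absorption_area / total_area
room_constant = absorption_area / (1.0 - mean_alpha)

def spl(lw_db, r_m, room_constant_m2, directivity=1.0):
    """Sound pressure level at distance r from a source with sound power level Lw."""
    return lw_db + 10.0 * math.log10(directivity / (4.0 * math.pi * r_m**2)
                                     + 4.0 / room_constant_m2)

print(f"Total absorption A = {absorption_area:.0f} m^2, room constant R = {room_constant:.0f} m^2")
print(f"SPL 3 m from a 75 dB sound power source: {spl(75.0, 3.0, room_constant):.1f} dB")
```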
The beauty and true potential of VR is that it can be taken as far as desired. When the initially low level of effort is increased, the information value of a virtual space can quite quickly overtake that of physical mock-ups. Hopefully this tips the scale so that, in the near future, customers will have a clear understanding that VR services (and their professional providers) are the most user-friendly and convenient way to go.