Why is an engine not capable of withstanding the load at low RPM (point 1), but it is perfectly able to at higher RPM (point 2)? What changes?
Interesting question @jstat .
I would guess that this is because the engine runs more efficiently at higher loads, closer to the NCR point. At low loads the combustion is not optimal. The main reason is that air charging is not optimal (in the past there were also issues with the operation of fuel valves at low loads, but this is no longer the case in current designs).
The above can be seen in a typical SFOC (g/kWh) curve of a two-stroke marine engine, where the SFOC is higher at lower loads.
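To make the shape of that curve concrete, here is a minimal sketch with purely illustrative numbers (the quadratic fit and all its coefficients are my own assumptions, not MAN data); real SFOC curves for large two-stroke engines typically bottom out somewhere around 65–80% load:

```python
# Illustrative SFOC-vs-load curve for a two-stroke marine engine.
# The quadratic shape and the coefficients are hypothetical; they only
# mimic the typical "U" shape with a minimum near ~75% load.

def sfoc(load_pct, sfoc_min=165.0, load_opt=75.0, k=0.004):
    """Specific fuel oil consumption (g/kWh) at a given engine load (%)."""
    return sfoc_min + k * (load_pct - load_opt) ** 2

for load in (25, 50, 75, 100):
    print(f"{load:3d}% load -> {sfoc(load):6.1f} g/kWh")
```

The point the curve makes is only qualitative: consumption per unit of work rises as you move away from the optimum load in either direction, and rises fastest at low load.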
Hi @Paraschos.Liadis, thanks for your reply!
I agree that operating the engine at lower loads is not optimal for many reasons: lower efficiency, poor thermal distribution, and yes, a wider range of (efficient) operation has definitely been introduced in newer engines.
However, it still does not explain why the engine would “break” (or whatever the “overload limit” implies) at point 1 (in orange) on the diagram.
After looking into this, I am pretty sure that it has to do with the principle of hydrodynamic lubrication.
For those not familiar with the principle, you may find some intuition in the image below:
If this is the case, the overload limit refers to the ability of the bearings to withstand the load. I can’t source this very well, but it seems that, at higher RPM, the oil wedge between the journals and the bearings becomes thicker. Hence, it can counteract greater piston firing loads without metal-to-metal contact (which would lead to bearing failure soon after).
For example, in the load diagram, you can see that 90% mep is acceptable at 95% rpm, but not at 75% rpm.
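A rough way to quantify this intuition is the Sommerfeld number from journal-bearing analysis, S = (r/c)² · μN/P, where a larger S corresponds to a thicker oil film. If we treat the specific bearing load P as fixed by the firing pressure (i.e. by mep) and vary only the shaft speed N, the sketch below shows why the same 90% mep is more comfortable at 95% rpm than at 75% rpm. All numeric values are illustrative assumptions, not data for any particular engine:

```python
# Sommerfeld number S = (r/c)^2 * mu * N / P for a journal bearing.
# Larger S -> thicker hydrodynamic oil film. All values are assumed.

def sommerfeld(mu_pa_s, n_rev_s, p_load_pa, r_over_c=1000.0):
    """Dimensionless bearing characteristic number."""
    return (r_over_c ** 2) * mu_pa_s * n_rev_s / p_load_pa

MU = 0.02          # lube oil viscosity, Pa*s (assumed)
P_90MEP = 9.0e6    # specific bearing load at 90% mep, Pa (assumed)
N_MCR = 100 / 60   # shaft speed at MCR: 100 rpm in rev/s (assumed)

s_fast = sommerfeld(MU, 0.95 * N_MCR, P_90MEP)  # 95% rpm, 90% mep
s_slow = sommerfeld(MU, 0.75 * N_MCR, P_90MEP)  # 75% rpm, 90% mep

print(f"S at 95% rpm: {s_fast:.4f}")
print(f"S at 75% rpm: {s_slow:.4f}")
# Same firing load at lower speed -> smaller S -> thinner film.
```

The ratio of the two cases is simply 0.95/0.75, i.e. at 75% rpm the film parameter is about 21% lower for the same load, which is the direction the load diagram suggests.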
As a follow-up question: the main bearing journals rotate continuously, so the hydrodynamic wedge is achieved fairly easily. But what happens at the crossheads, where the journal motion is oscillating? Perhaps the point of failure is at the crosshead rather than at the main bearing.
Would love to learn more about this from a specialist because it has been a long-time question of mine. There is very little information available online about how this principle applies specifically to marine two-stroke engines.
@jstat, hi, interesting question indeed.
Agreed with Paraschos: “At low loads the combustion is not optimal. The main reason is that air charging is not optimal.”
As per MAN B&W definition of Torque Curve:
“This line represents the torque/ speed limit for continuous operation of the engine, which is mainly defined by the thermal load of the engine components.”
As we have seen already with the DLF and the upcoming AWC in MAN electronic engines, torque limits can be exceeded by the ECS for a short period of time.
Check the attached MAN circular, it’s very interesting.
The DLF itself works by calculating the available air in the engine cylinders before each combustion takes place. When the mass of air is known, the Engine Control System (ECS) calculates the maximum allowable amount of fuel that can be injected into the combustion chamber before reaching the minimum acceptable air excess ratio. After approximately 30 minutes of DLF operation, the engine components need to cool down and the engine gradually reduces the limits back to the normal fuel index limiter.
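The air-based fuel limit described above boils down to a simple mass balance: given the trapped air mass, a stoichiometric air-fuel ratio, and a minimum acceptable excess-air ratio, the maximum fuel per combustion follows directly. A sketch of that calculation, where the trapped air mass, AFR, and λ values are illustrative assumptions on my part, not MAN ECS parameters:

```python
# Max fuel per combustion from the available air, per the DLF idea:
#   lambda = m_air / (m_fuel * AFR_stoich) >= lambda_min
#   =>  m_fuel_max = m_air / (AFR_stoich * lambda_min)
# All numeric values below are illustrative assumptions.

AFR_STOICH = 14.5   # kg air per kg fuel for diesel fuel (approx.)
LAMBDA_MIN = 1.8    # minimum acceptable air excess ratio (assumed)

def max_fuel_per_stroke(m_air_kg, afr=AFR_STOICH, lam_min=LAMBDA_MIN):
    """Maximum fuel mass (kg) allowable for this charge of air."""
    return m_air_kg / (afr * lam_min)

m_air = 8.0  # kg of trapped air in one cylinder (assumed, large bore)
print(f"max fuel per stroke: {max_fuel_per_stroke(m_air):.3f} kg")
```

This is why the limit is dynamic: the trapped air mass changes with scavenge pressure and engine condition, so the allowable fuel index is recomputed before each combustion rather than read off a fixed curve.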
The AWC functionality increases the amount of fuel injected into the combustion chamber, which increases the torque output of the engine, while delaying the fuel injection, which limits the peak pressure and temperature in the combustion chamber. The thermal loads on the piston crown, exhaust valve, cylinder liner, and cylinder cover are thereby reduced and kept below the temperatures attained at the SMCR.
The delayed fuel injection improves conditions in the combustion chamber but increases the SFOC by 3% to 4%, compared to when the engine operates on the traditional torque limit curve.
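To get a feel for what that 3–4% penalty means in absolute fuel, here is a quick back-of-envelope calculation; the baseline SFOC and engine power are assumed values for illustration, not figures from the MAN circular:

```python
# Fuel cost of running with AWC-delayed injection versus operating
# on the traditional torque limit curve. Baseline values are assumed.

BASE_SFOC = 170.0   # g/kWh on the traditional torque limit (assumed)
POWER_KW = 20_000   # engine power at the operating point (assumed)

for penalty in (0.03, 0.04):
    sfoc = BASE_SFOC * (1 + penalty)
    extra_kg_per_h = POWER_KW * (sfoc - BASE_SFOC) / 1000.0
    print(f"+{penalty:.0%}: SFOC {sfoc:.1f} g/kWh, "
          f"extra fuel ~{extra_kg_per_h:.0f} kg/h")
```

On these assumed numbers the penalty is on the order of 100–140 kg of fuel per hour, which is why the function is framed as a temporary aid rather than a normal operating mode.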