
08-26-2011, 08:24 PM
Quote:
Originally Posted by Viper2000
The supercharger is driven by the engine.
If you reduce the power consumed by the supercharger then you increase the brake horsepower and reduce the SFC.
Supercharger power consumption is just W*Cp*deltaT, ie W*deltaH.
Supercharger isentropic efficiency is
deltaH[isentropic]/deltaH[actual]
In the case of the Merlin, this figure was about 70%.
For isentropic, adiabatic compression,
T2 = T1(P2/P1)^((gamma-1)/gamma)
Hence it's trivial to calculate the isentropic deltaT, and deltaH.
DeltaT and deltaH both get smaller if we reduce T1.
Injecting fuel upstream of the supercharger reduces the temperature by about 25 K due to the latent heat of evaporation of the fuel.
This reduces the temperature rise across the supercharger, which is equivalent to increasing its adiabatic efficiency.
Clearly this confers an advantage to engines which inject fuel upstream of the supercharger. Given the considerable difficulty associated with increasing the aerodynamic efficiency of compressors, this advantage is not insignificant.
Mixture distribution is going to be very good provided that the charge temperature is sufficiently high for complete evaporation to be ensured. This will basically always be the case at high powers because deltaT is 100 K or more; indeed intercooling & aftercooling start to become necessary once you've got a lot of supercharge.
These advantages vanish at low non-dimensional power settings. Cars spend most of their time at very low non-dimensional power settings, and therefore DI wins hands down most of the time, especially if you go for CI, in which case it's almost no-contest.
In the end, the nature of all engineering trade studies is that the devil is in the detail. The optimum is a strong function of engine size and duty cycle, and we just don't build the sort of highly supercharged, high power spark ignition engines for which single point injection is attractive these days.
To use an analogy, old amplifiers used valves and therefore tended to have large transformers & rectifiers to produce the high DC voltages which allowed them to function. Most modern amplifiers are solid state, and they don't need those high voltages.
This doesn't mean that high DC voltages aren't still a good idea for valve amplifiers; I've got a pair of hundred watt half stacks sat next to me which run in excess of 400 V DC and sound great. But probably 99% of modern amplifiers for domestic use are solid state and so if you just ask "are high voltages a good idea for amplifiers" then the short answer is "probably not".
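For reference, here is a minimal numeric sketch of the calculation quoted above. The pressure ratio, inlet temperature and mass flow are assumed figures chosen purely for illustration; only the 70% efficiency and the 25 K of evaporative cooling come from the quote.
Code:
# Sketch of the quoted supercharger calculation (assumed figures marked).
cp    = 1005.0   # J/(kg*K), specific heat of air at constant pressure
gamma = 1.4      # ratio of specific heats for air
pr    = 3.0      # pressure ratio across the blower (assumed)
eta   = 0.70     # isentropic efficiency, as quoted for the Merlin
w     = 1.0      # kg/s charge mass flow (assumed)

def blower(t1):
    """Temperature rise and shaft power for inlet temperature t1 [K]."""
    dt_isentropic = t1 * (pr ** ((gamma - 1.0) / gamma) - 1.0)
    dt_actual = dt_isentropic / eta
    return dt_actual, w * cp * dt_actual

for t1 in (288.0, 288.0 - 25.0):   # dry inlet vs. 25 K of evaporative cooling
    dt, power = blower(t1)
    print(f"T1 = {t1:5.1f} K: deltaT = {dt:5.1f} K, blower power = {power/1000:5.1f} kW")
With these assumed numbers the cooler inlet cuts the blower temperature rise and power by roughly 9%, which is the effect the quote describes.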
Viper,
The basic premise you posted is entirely wrong for all practical purposes. Your math does not take into account the heat of the engine or the heat the intake manifold transfers to the charge.
The conclusion reached is incorrect when it comes to engines...
Quote:
Injecting fuel upstream of the supercharger reduces the temperature by about 25 K due to the latent heat of evaporation of the fuel.
Injecting fuel into the intake raises the charge temperature. Liquid fuel transfers heat more readily and has a higher heat capacity than air, so a wet charge absorbs more of the intake manifold's heat. The overall effect is that the charge reaches the cylinder hotter and is therefore less dense.
You can confirm this with a copy of:
V.L. Maleev, Internal-Combustion Engines: Theory and Design, 2nd ed. (New York: McGraw-Hill Book Company, Inc., 1945).
http://books.google.com/books/about/...d=fgvHHgAACAAJ
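To put rough numbers on that effect, here is a minimal sketch. The wall temperature, surface area and film coefficients are assumed purely for illustration (they are not figures from Maleev); only the direction of the effect matters.
Code:
# Sketch: a wet (fuel-laden) charge picks up more heat from the hot intake
# manifold than dry air would, so it arrives hotter and less dense.
cp   = 1005.0   # J/(kg*K), specific heat of the charge (treated as air)
mdot = 0.10     # kg/s charge flow (assumed)
t_in = 288.0    # K, charge entering the manifold (assumed)
t_w  = 360.0    # K, manifold wall temperature (assumed)
area = 0.05     # m^2 of wetted manifold surface (assumed)

def exit_state(h):
    """Exit temperature and relative density for a film coefficient h [W/(m^2*K)]."""
    q = h * area * (t_w - t_in)      # crude heat pickup from the walls, W
    t_out = t_in + q / (mdot * cp)
    return t_out, t_in / t_out       # ideal gas at constant pressure

for label, h in (("dry air", 60.0), ("wet charge", 150.0)):   # assumed coefficients
    t_out, rho = exit_state(h)
    print(f"{label:10s}: leaves the manifold at {t_out:5.1f} K, density {rho:.3f} x inlet")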
Quote:
So why does an IO-360 (fuel injected) have a higher peak power than a O-360 (carbureted)? The answer is that fuel injection reduces losses in the intake system. The first reason is that the venturi in the carburetor is another constriction in the flow, which manifests itself as a pressure drop in the intake manifold. This pressure drop is eliminated with a fuel injection system, thus allowing a higher pressure to reach the cylinders, and thus a larger amount of fuel/air charge to enter the cylinder.
The second reason is that the fuel/air charge is colder, and thus denser when it reaches the cylinder, again allowing a larger amount of fuel/air charge to enter the cylinder. Just like when you add carb heat, the density of the fuel/air charge is reduced when it is heated. So you're asking "Why would it be heated?" In some carbureted engines, the intake manifold is heated to assist distribution. Even without intake manifold heating, the intake manifold will be hotter than the ambient air simply because it is attached to the engine. Heat transfer studies have shown that the liquid fuel on the walls of the intake manifold increases the rate of heat transfer. (Ref 1) Thus, in a carbureted engine, the small drops of fuel in the fuel/air charge cause the charge to heat up more passing through the intake manifold than dry air would passing through the same intake manifold. Therefore, the density of the fuel/air charge is decreased, reducing the amount of charge entering the cylinder. Experiments have shown that volumetric efficiency may be increased by 10% by direct injection of the fuel into the cylinders. This also prevents loss of fuel because of valve overlap. Fuel injection into the intake port (just outside the intake valve) shows a smaller, but appreciable improvement. (Ref 1)
http://www.eaa1000.av.org/technicl/e...htm#References
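A back-of-the-envelope version of the two effects in that quote, using the ideal gas law. The venturi pressure drop and the charge temperature rise below are assumed figures, not numbers from the EAA article.
Code:
# Trapped charge mass scales with manifold pressure over charge temperature.
R = 287.0   # J/(kg*K), gas constant for air

def charge_mass(p, t, v_cyl=0.001):
    """Charge mass trapped in a 1-litre cylinder, ideal gas, perfect filling assumed."""
    return p * v_cyl / (R * t)

injected   = charge_mass(101325.0, 288.0)                  # no venturi loss, cooler charge
carbureted = charge_mass(101325.0 - 3000.0, 288.0 + 15.0)  # assumed 3 kPa venturi drop, 15 K hotter

print(f"carbureted charge mass relative to injected: {carbureted / injected:.3f}")
With those assumptions the carbureted cylinder traps roughly 8% less charge, which is in the same ballpark as the ~10% volumetric efficiency gain the article quotes for direct injection.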
Last edited by Crumpp; 08-26-2011 at 08:27 PM.