
Hi everyone,

 

I am wondering if this is a feature that doesn't exist in RL or one that I have just not yet discovered...

The question is the following: when you release ordnance from one wing only, the plane's center of gravity shifts to the opposite side and it tends to roll, something that you can correct by manually trimming the plane.

 

I am just surprised that an automatic trimming function wouldn't exist in such a sophisticated plane... It sounds reasonable to me that the roll rate could be analysed by the computer and trimmed out accordingly after the pilot's action.

 

Did I miss something?

 

Thanks


No, you didn't miss it.

Technically, I think it could be done. Heck... they can turn it into a drone:biggrin:  No need for a driver.

 

Cougar, you're the guy from the other sim, aren't you? I remember that callsign.

6 minutes ago, Gripes323 said:

No, you didn't miss it.

Technically, I think it could be done. Heck... they can turn it into a drone:biggrin:  No need for a driver.

 

Cougar, you're the guy from the other sim, aren't you? I remember that callsign.

Hi Gripes323,

 

OK, thanks. That would be useful, but as you said, a driver is needed 😇

 

Which other sim are you speaking about?



It might have been considered and not pursued due to cost or time. The thing you need to remember is that in the early days of the Hornet, it was all about dumb iron bombs: carry a bunch, and drop them all in one or two seconds. Remember, the Hornet entered service in 1982.

 

They had a lot to develop, and it was the first of the truly multi-role jets that could do both air-to-air and effective bombing just by pushing a button to change roles. Yes, it's true the F-16 was pretty much the same. But neither of those jets was really expected to drop heavy ordnance on one side and then fly around with that imbalance for another minute or three hours, because in those days it was really just the F-111 that was using LGBs for special-purpose strikes.

 

It was a little bit of a surprise to see, by the mid-'80s, a two-seater Hornet featuring a laser pod, and it seemed to be a special-mission item for unusual tasks rather than the start of the wave of the future.

 

So if time and money are factors in keeping the Navy happy, the last thing you want to do is spend time on a trim function you don't think will be used at all!  I'd be curious to see whether the newer Rhinos, the E's and F's and whatnot, do have auto trim for that, since precision munitions were quite common before they even hatched the idea of a SuperBug to the Navy brass. It was probably a super-cheap few lines of code for the FBW computers on the newer jet, but on the Legacy Hornet it probably would have cost in hardware, programming, and extensive testing.



The Hornet's FCS response to uncommanded roll rate is very limited compared to pitch rate and load factor changes. Roll rate feedback is scheduled with air data and gained down quite low. This limits the amount of flight control surface deflection the system can use to counteract uncommanded roll rates. You can view the gains in this NASA paper.

 

image.png

 

 

image.png

 

When the dynamic pressure is 500 PSF, the roll rate feedback commands less than 5% of the roll rate. So the system is only acting on 5% of the uncommanded roll rate.

 

The FCS does try to damp out roll rate, but it just can't help that much.
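To make the gain-scheduling point concrete, here's a toy sketch of a roll damper whose feedback gain shrinks with dynamic pressure. Only the "less than 5% at 500 PSF" figure comes from the paper; the shape of the gain curve and all the numbers are invented for illustration.

```python
# Toy illustration, NOT the real FCS code: a roll-rate damper whose
# feedback gain is scheduled down with dynamic pressure, so only a small
# fraction of an uncommanded roll rate is ever fed back to the surfaces.

def roll_rate_gain(qbar_psf: float) -> float:
    """Hypothetical gain schedule: higher dynamic pressure -> lower gain.
    Chosen so the gain is 0.05 (5%) at 500 PSF, matching the cited figure;
    the rest of the curve is invented."""
    return min(0.15, 25.0 / qbar_psf)

def feedback_command(roll_rate_dps: float, qbar_psf: float) -> float:
    """Surface command (degrees) opposing the sensed roll rate."""
    return -roll_rate_gain(qbar_psf) * roll_rate_dps

# At 500 PSF, a 40 deg/s uncommanded roll draws only a ~2 deg opposing command.
cmd = feedback_command(40.0, 500.0)  # -> -2.0
```

With numbers like these it's easy to see why the damper "just can't help that much": a brisk uncommanded roll yields only a couple of degrees of surface.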

 

Below is an example from a simulation of an older version of the FCS. It comes from https://apps.dtic.mil/sti/pdfs/ADA144480.pdf.

 

We're looking at the ailerons' response to a max roll rate command. Then, halfway through the simulation, the roll command is reversed: it goes from full right to full left. Also, starting at 0.6 seconds, the roll rate sensor is ramped up in the direction of the roll. From 0.6 seconds you can see the FCS trying to damp out this increased roll rate, but the system can only command a few degrees of aileron to do this.

 

 

 

image.png

The same is true of the Stabs.

image.png

 

The McAir team that worked on the Hornet had its hands full just dealing with handling qualities and meeting the roll rate requirements. To get it right they had to resize the ailerons, conceive and implement a wing-warping solution, stiffen the wings, and completely redesign the FCS.

image.png

Read more about those changes here.

http://aviationarchives.blogspot.com/2016/12/early-years-in-hornets-nestf-18-growing.html

and here:

http://aviationarchives.blogspot.com/2015/01/f-18-flying-qualities-development-report.html

 

As a side note, I'd like to address those flight control responses I posted above. I was hesitant to post them because I don't know how accurate they are to real life. A: the report notes they are not verified against either the McAir model or flight testing. B: there are also a few other things that make me apprehensive about them.

 

Primarily, it's that the pilot input is noted in lbs of force. Early versions of the Hornet used a force sensor. However, this was replaced by a position sensor in the 6.X PROMs in late 1981. The roll rate report says it's modeling version 8.2.1 from August of 1982. In this PROM, pilot input for pitch and roll command should be in inches. It is then processed via gradient functions and fed into the FCS in terms of volts for pitch or degrees for roll.
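For readers unfamiliar with what a "stick gradient" is: it's a shaped lookup from stick position to command. A minimal sketch of that idea, with invented breakpoints (the real 8.2.1 tables are in the cited report), mapping stick inches to a roll command in degrees:

```python
# Sketch of a non-linear stick gradient: stick position (inches) ->
# roll command (degrees). Breakpoint values are invented for illustration;
# only the general piecewise-linear, center-flattened shape is the point.

BREAKPOINTS = [(0.0, 0.0), (1.0, 3.0), (2.0, 9.0), (3.0, 20.0)]  # (inches, degrees)

def roll_gradient(stick_in: float) -> float:
    """Piecewise-linear interpolation, symmetric about center,
    clamped at full stick throw."""
    sign = -1.0 if stick_in < 0 else 1.0
    x = min(abs(stick_in), BREAKPOINTS[-1][0])
    for (x0, y0), (x1, y1) in zip(BREAKPOINTS, BREAKPOINTS[1:]):
        if x <= x1:
            return sign * (y0 + (y1 - y0) * (x - x0) / (x1 - x0))
    return sign * BREAKPOINTS[-1][1]
```

Note how the first inch of travel produces only a small command while the last inch produces most of it; that center-flattening is what a gradient function buys you.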

image.pngimage.png

 

 

The authors of this model should be converting from force to inches. I just don't see where the code is doing that.

image.png

This further aroused my suspicion, so I did a cursory examination of the program to see if it was a mislabeled older-version PROM. To verify that it was in fact 8.2.1, I looked at some of the air data feedback gains, and those appear accurate for 8.2.1. Even the stick gradient functions are correct for 8.2.1.

image.png

It just looks like the authors are inputting double the maximum roll command possible. Which raises the question: does this matter?

 

I'm not sure it does. Max surface deflection is limited by air data, angle of attack, G, etc., and those limitations are properly modeled in this program. Thus commanding twice the maximum input won't result in any excess aileron deflection: the surfaces hit their air-data-defined position limits before they hit max possible deflection. That's why it should be fine to use this example to illustrate the limited roll rate feedback of the FCS, as the output from the stick gradient will just top out at 20 and then try to counteract the excess roll rate.
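As a rough illustration of why the doubled command is harmless, here's a sketch with a made-up position-limit schedule; the real limits come from the air data tables in the program, not this formula:

```python
# Why doubling the stick command doesn't matter: the commanded surface
# deflection is clamped to an air-data-defined position limit before it
# ever reaches the actuator. The limit schedule below is invented.

def limited_aileron(command_deg: float, qbar_psf: float) -> float:
    """Clamp an aileron command to a hypothetical position limit that
    shrinks with dynamic pressure, as the real FCS limits do."""
    limit = max(5.0, 25.0 - qbar_psf / 50.0)  # degrees, invented schedule
    return max(-limit, min(limit, command_deg))

# At 500 PSF the limit is 15 deg, so a 20 deg command and an (impossible)
# 40 deg command both saturate at exactly the same deflection.
```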

 

I could go on, but this has become another monster post. I hope some of you have found it entertaining/useful. If anyone is interested, the language this program is written in is a real peach.

http://www.bitsavers.org/pdf/mitchellAndGauthierAssoc/Advanced_Continuous_Simulation_Language_1981.pdf

 

 


29 minutes ago, Curly said:

Primarily, it's that the pilot input is noted in lbs of force. Early versions of the Hornet used a force sensor.

Could that mean that the early Hornet stick was intended to function similarly to the F-16's stick? I.e., the stick had almost no movement; the input was measured as force instead of stick travel.

 

 The planes are siblings, of a sort 😛 The younger, less extreme, half-brothers of the space shuttle 😛

 

Glad to see that they had to add some curvature to the roll-input 😛 I can feel less bad about having the curvature set to 40 in DCS 😛 

 


6 hours ago, Curly said:

The Hornet's FCS response to uncommanded roll rate is very limited compared to pitch rate and load factor changes. ...

I digested most of this, hehe.

Still... I think it would probably be possible to implement roll trimming for level flight, not that manual trimming is a big problem.

Say, before launch, you do your trim for the asymmetric loadout. After launch, with increasing airspeed, the system wouldn't require radical corrections; besides, it's already biased before launch.

After dropping ordnance the corrections may be more pronounced... Well, not really a big deal.

In DCS, if I'm dropping JDAMs from high altitude and don't feel like trimming and re-trimming constantly, I turn ATTH on till I'm done releasing one at a time. It helps a little. If it takes too long between releases, trim...

6 hours ago, TimRobertsen said:

...Glad to see that they had to add some curvature to the roll-input 😛 I can feel less bad about having the curvature set to 40 in DCS 😛 

 

 

 

I've never felt bad about curving the pitch and roll axes. You have to for in-flight refueling, form flying, etc. The only issue with pitch: if you dumb it down too much around the center and then go do some A/A gunnery, it takes a little effort to master the changing gain as you adjust your pull.

16 hours ago, TimRobertsen said:

Could that mean that the early Hornet-stick was intended to function similar to the F-16s stick? I.e. the stick had almost no movement; the input was measured in force instead of stick-travel. ...

The F-18's FCS was more of an advancement of the F-15's. The F-15's FCS is based around a hydro-mechanical system and a control augmentation system (CAS). It has a moving stick with a force sensor. The stick is physically connected to the control surfaces through a series of mechanical boxes designed to provide specific handling qualities. The F-15 and F-18 were designed to have similar handling qualities, that is, a consistent amount of G per stick force. Just to clarify, the amount of G per stick force is different for each aircraft.

 

Hornet Stick Force per G.

image.png

 

F-15 Stick Force Per G

image.png

The force sensor on the F-15's stick was essentially a means of error-correcting the mechanical flight controls to deliver the stick-force-per-G schedule. While the mechanical system is geared to deliver this G schedule, sometimes it needs help or is overaggressive. Let's say the pilot pulls the stick back 0.025 meters; this takes ~45 newtons of force and should result in ~3 g. However, there are transients in the system. This is where the stick sensor comes into play: it detects the amount of force the pilot is exerting on the stick. If the G commanded is different from the G sensed, the CAS drives the stabilators to achieve the commanded G schedule.
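The correction loop can be sketched roughly like this. The gains and the newtons-per-g gradient are invented (the real stick-force-per-G schedules are in the figures above); only the structure, force in, G error out, stabilator increment, is the point:

```python
# Toy version of the CAS error-correction idea: commanded G comes from
# stick force through a linear stick-force-per-G gradient, and the CAS
# drives the stabilators in proportion to the G error. All numbers invented.

FORCE_PER_G = 22.5  # newtons of pull per g, hypothetical

def commanded_g(stick_force_n: float) -> float:
    """1 g at zero force (level flight), increasing linearly with pull."""
    return 1.0 + stick_force_n / FORCE_PER_G

def cas_stab_command(stick_force_n: float, sensed_g: float, k: float = 2.0) -> float:
    """Stabilator increment (degrees) proportional to the G error."""
    return k * (commanded_g(stick_force_n) - sensed_g)

# A 45 N pull commands ~3 g here; if the mechanical system only delivers
# 2.5 g, the CAS adds a 1 deg nose-up stabilator increment. If sensed G
# matches commanded G, the CAS adds nothing.
```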

 

This is all done through a series of analog systems. The air data scheduling is done through a series of mechanical monsters.       

image.pngimage.png

 

Instead of using a computer and an algorithm to compute the air data schedule like the F-18 does, the F-15 uses this parallelogram ratio changer to alter the amount of control surface deflection commanded based on airspeed and altitude.

 

https://www.f15sim.com/images/F-15_Lateral_Control-4.jpg

 

Even the F-15's CAS is essentially an analog fly-by-wire system, with all the signal processing done by op-amps, resistors, etc. Which isn't a problem, because the system works as intended for the most part.

 

https://www.f15sim.com/images/F-15_Pitch_Control-1.jpg

 

 

The F-16's first FCS was also analog. It used a system of resistors, amps, etc., like the F-15's CAS.

 

The Hornet took the concept of the F-15's FCS and digitized it. This would get rid of all the mechanical monsters, and updating the FCS would be as easy as loading new software. The initial design used a moving stick with the force sensor attached, just like the F-15's. However, the hydro-mechanical input system was only to be used as a backup; it was essentially deactivated unless the aircraft was operating in a degraded mode. The force sensor alone was now the input to the FCS. However, there were teething problems: using the force sensor with the stick never really worked well in the Hornet.

 

In the early FCS they had to digitally filter out some of the stick dynamics. Since the stick both moved and sensed force, the interplay of dynamics got complicated. Think of a sudden acceleration: it throws you back in the seat, causing you to accidentally apply more force to the stick, causing the aircraft to pitch up further. The engineers recognized these types of effects and built in a lot of digital signal filters and transfer functions. The result was complicated and computationally expensive, which led to a laggy system prone to pilot-induced oscillations. This is discussed in the flying qualities development report I linked above. Since the early system was so problematic, they just abandoned the force sensor concept, though they did leave the sensor on the stick, empty and non-functioning. That is where the lineage from the F-15 still remains.

 

Re: the curves in the gradients: note that the stick force per G is linear. The non-linear stick gradients in the FCS are part of the way the FCS tries to linearize the response of the aircraft. This is not the same reason we put curves on our sticks in the sim; we do that because we want finer control of the aircraft, which is reasonable given the small throw of a desktop joystick.
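For contrast, the sim-side shaping can be sketched like this. The power curve and the mapping from a 0-100 "curvature" setting to an exponent are assumptions for illustration, not DCS's actual formula:

```python
# A common response-curve shaping function: flatten the response around
# center for finer control while preserving the endpoints. This is NOT
# DCS's exact curve; the setting-to-exponent mapping is invented.

def curved(x: float, curvature: float) -> float:
    """Map stick input x in [-1, 1] through a power curve.
    curvature=0 is linear; larger settings flatten the center."""
    k = 1.0 + curvature / 20.0  # invented mapping from a 0-100 setting
    return (1.0 if x >= 0 else -1.0) * abs(x) ** k

# With a setting of 40 (k = 3), half stick gives only 12.5% output,
# while full stick still gives 100%.
```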

 

 

9 hours ago, Gripes323 said:

 

I digested most of this, hehe. Still... I think it would probably be possible to implement roll trimming for level flight, not that manual trimming is a big problem. ...

Flight testing... that's the limiting factor in updates. There is never enough time or money to do everything. When OFP 10.5.1 was being tested, what was believed to be a very simple change to the AOA probe failure logic resulted in a serious number of problems. Handling qualities on approach were dangerously degraded, and a lot of time and money was spent trying to fix it. After all was said and done, the only fix was to rem out the dual AOA failure logic. Read about it here: https://apps.dtic.mil/dtic/tr/fulltext/u2/a307768.pdf

 

The roll controls are ubiquitous in the FCS. To realize an automatic roll trim, you would either have to change the gain schedule to make it more aggressive or implement an integral/derivative feedback system, either of which would require a complete validation of the entire envelope. There is nothing easy or cheap about those changes. The Legacy only got the 10.7 OFP because it piggybacked on the Super.
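In the abstract, such an added integral feedback path might look like this minimal sketch; the gain, frame rate, and structure are all invented, and precisely because it's a new feedback path acting on every frame, it would have to be revalidated across the whole envelope:

```python
# Hypothetical "automatic roll trim": an integrator on sensed roll rate
# that slowly biases the aileron command. Gain and frame rate are invented.

def make_auto_trim(ki: float = 0.02):
    """Return a stateful trim function: call once per FCS frame with the
    sensed roll rate (deg/s); returns the accumulated trim bias (deg)."""
    state = {"bias": 0.0}
    def step(roll_rate_dps: float, dt: float = 0.0125) -> float:  # ~80 Hz frame
        state["bias"] -= ki * roll_rate_dps * dt  # integrate opposite the roll
        return state["bias"]
    return step

# A steady 10 deg/s uncommanded roll (say, after a one-sided release)
# is gradually trimmed out as the bias accumulates frame by frame.
trim = make_auto_trim()
```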



Excellent thread!:thumbup:

 

Newer fighters such as the Rafale and Typhoon feature auto trim on the roll and yaw axes. I think the reason it wasn't done until then perhaps comes down to complexity and cost, as Curly mentioned.
