DCS Bios Servo Settings Help


Kenpilot

I've just started using DCS-BIOS and Arduino, and I'm having a tough time dialing in the servo settings so that the gauges in my panels move with and match the actual gauges in DCS. Is there an actual method for choosing the numbers you put in the sketch's servo command lines, or do you literally have to guess and keep trying random numbers until your gauges track the ones in DCS? Specifically, I'm talking about the flaps, hydraulic pressure, and fuel quantity gauges. For example, the default values in the DCS-BIOS Control Reference sketches are 544 and 2400. What exactly do these numbers mean? And again, is there a method to dial in the right numbers, or is it a total guessing game where you keep changing them and going back into DCS to see if you got it right? That would obviously take forever. Thanks!

Windows 10

ASRock Z370 Extreme4 LGA 1151 (300 Series) MOBO

intel i7-8700k (Not overclocked)

16 GB Ram

EVGA GeForce GTX 1080 Ti SC Black Edition

SSD

Trackir


544 and 2400 are the number of microseconds (us) for the 0 and full travel positions.
The DCS servo commands take their numbers and "map" them to the servo range you've given, then send that out to the servo.
544 and 2400 are "safe defaults" that shouldn't overdrive most servos.

So, if the full range of the DCS numbers were 0 to 90: when the DCS number was 0, the servo would be set to 544 us, and if it was 90, the servo would be set to 2400 us.
What you need to do is determine the us numbers for your servos' 0 and full travel points, then put those into the servo commands. BUT, you have to be gauge-specific.

e.g. the flaps gauge only has 90 degrees of travel, so you need to figure out the 0 and 90 point us numbers and go from there.


Posted (edited)
1 hour ago, No1sonuk said:

544 and 2400 are the number of microseconds (us) for the 0 and full travel positions.

[snip]

Thanks for the response! That's my problem: how do I figure out the 0 and 90 point us numbers? Do I just randomly guess two different numbers and check? That will take forever. I have tried a couple of different people's servo settings from their sketches and none of them matched the gauge in DCS. I don't think I'm understanding the whole concept of the microseconds and the 0-90, etc. Maybe I'm doing it wrong, but I look at it like this: a servo travels 180 degrees. The flap position indicator for the A-10 travels from the 3 o'clock position (0 degrees of flaps) to the 6 o'clock position at full flaps (30 degrees). To me that is the 90-degree position to the 180-degree position. I have found two numbers that make it go from the 0-flaps position (3 o'clock / 90 degrees) to the 30-degree flap position (6 o'clock / 180 degrees), but when I select flaps 10 degrees down from 0, the gauge indicates something like 12 degrees of flaps, not 10. Then when I select flaps 20 degrees down from 10, the indicator goes to 30 degrees of flaps. Then when I retract them, they stop at totally different numbers. I'm completely lost.

 

Shouldn't everybody's servo numbers be the same in the Arduino sketch for the flap position indicator? Why does everyone have different numbers?

 

Am I understanding the math correctly, or is this not how servos work? 544 to 2400 microseconds = 180 degrees of travel, so 2400 − 544 = 1856 microseconds to travel 180 degrees, and 1856 / 180 = 10.31 microseconds to travel 1 degree. So let's say the 0-degree flap indication for my servo is 1300; then, using that math, the 30-degree flap indication (90 degrees of travel = 927.9 microseconds) should be 1300 + 927.9 ≈ 2228.


Edited by Kenpilot



1 hour ago, No1sonuk said:

544 and 2400 are the number of microseconds (us) for the 0 and full travel positions.

[snip]

So I tried what you said and figured out the 0 and 90 point us numbers for the flaps gauge: 1325 and 1770. That gives me the 90 degrees of travel from the 0-degree flap indication to the 30-degree indication. Now what? Because right now, with those numbers, if I select flaps to 10 degrees it goes to 10 degrees, but when I select flaps 20 degrees it goes all the way to the 30-degree indication, when the flaps are actually at 20. I'm totally lost.



10 minutes ago, agrasyuk said:

Sounds about right. If you're flying the A-10, the first flap stop is actually 7 degrees, and roughly three times that gets you to the 20-degree full-flap setting.

To resolve it, set your first step to ~7.

Three times what will get me to the 20-degree full-flap position?



On 5/21/2021 at 7:31 PM, Kenpilot said:

Thanks for the response! That's my problem, how do I figure out the 0 and 90 point us numbers??? Do I just randomly guess two different numbers and check it? This will take forever.

[snip]

 

Shouldn't everybody's servo numbers be the same in the arduino sketch for the flap position indicator? Why does everyone have different numbers? 

 


I made a device to speed up the process of figuring out the end points.
I just made a thread here:

 

As for "Shouldn't everybody's servo numbers be the same":
Yes...  Assuming they all used the exact same servo.
Different servos require subtly different times.

