
optically guided A2A missiles?



I think there is some misunderstanding here: when manufacturers today speak about electro-optics, they do not mean obsolete TV systems. They actually mean sensors that cover a very wide spectral range, from UV to IR, mostly using a CCD. We had this discussion already about the Maverick missile, where people still seem to claim the older IIR versions must be inherently superior to the latest electro-optic models, like the K variant. I don't think so.

 

There is little comparison between modern EO and old TV seekers. The Python 5 seeker is just a very sophisticated sensor that covers UV through visible to IR. The computer-generated fusion image has properties and information a TV image doesn't have, and that older-generation, heat-signature-oriented IR systems did not have either. A modern EO seeker is simply the next generation of IIR: imaging, yes, but not limited to the IR spectrum.

 

Claiming that spectral info beyond IR is of no use at all is difficult to maintain. It very often makes sense to have access to a wider spectrum.

 

A wider spectrum means more information. Of course this requires more computing power, and there is always a trade-off between what you actually need and what you can calculate in time. As processing power has evolved considerably, so has the meaningful use of a broader spectrum. An IR signature, for example, would not be your first choice for building a 3D perspective view.

 

The whole point is: if we look at TV footage of a fighter launching flares in daylight, any one of us can clearly distinguish the aircraft from the flares. Older-generation heaters just can't. At night it's another story: if the fighter doesn't use afterburner, we would maybe only be able to see the flares, whereas IR can still see the hot parts of the aircraft. I guess today's WVR missiles just want to combine these skills?



I heard of something like that that is supposed to be used on later Typhoon models. I think it was some sort of advanced chaff made from a slightly different material. When it is released, the internal jammer in the Typhoon sends a burst of noise jamming at a particular frequency that bounces off the chaff cloud and generates a larger false target far from both the aircraft's and the chaff cloud's position. I believe it was called 'jaff'.

 

I read that back in 1999; it could have been complete hearsay, but that's what I gathered from it. Never heard of IR chaff before.


Well, I can say this.

 

To an IR seeker, a flare is pretty much blinding. In the visual spectrum, it's not so bad.

 

There is just as much benefit to visual, as there is to IR or UV. More data is always a good thing.

 

And you must also realize that just because the system is "optical" doesn't mean it works in the visual spectrum.


No, more data is not always a good thing. As a matter of fact, TOO MUCH data is a very bad thing.

 

So no, the visual isn't useful; it's too much *bad* data.

 

A flare doesn't 'blind' an IR seeker so much as it makes the flare tastier to the seeker than the aircraft; this is particularly true of reticle seekers.

 

FPAs are an entirely different matter, where 'blinding' will be somewhat more useful than 'attracting', since the FPA will reject the flare on the basis of (probably) comparing a set number of frames. But it's not infallible - so you may as well deny it an image altogether.

 

By the way, anyone know what UV filters are used for with IR seekers? Anyone? Guess what flares radiate? ... so yeah, that little filter is one heckuva ECCM piece. UV+IR=flare.
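To make the UV-filter trick concrete, here is a purely illustrative sketch - the thresholds and the intensity scale are invented, not taken from any real seeker. The idea is simply that a magnesium flare is bright in both UV and IR, while an airframe's hot parts radiate mostly in IR alone:

```python
# Hypothetical sketch of the UV+IR=flare rule: a flare radiates strongly
# in BOTH UV and IR, while an aircraft's hot parts radiate mostly in IR.
# All thresholds and intensity units are invented for illustration.

def classify_source(ir_intensity, uv_intensity,
                    ir_threshold=0.5, uv_threshold=0.3):
    """Return 'flare', 'aircraft', or 'background' for one tracked spot."""
    if ir_intensity < ir_threshold:
        return "background"   # too dim in IR to be worth tracking at all
    if uv_intensity >= uv_threshold:
        return "flare"        # hot in IR *and* bright in UV => flare
    return "aircraft"         # hot in IR but UV-dark => airframe/engine
```

A real seeker implements this with an optical filter rather than software, but the decision logic it buys you is the same.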


Reminder: SAM = Speed Bump :D

I used to play flight sims like you, but then I took a slammer to the knee - Yoda


How is visual light "bad"? Provide something other than your opinion.

 

IR is limited - it's limited to a small slice of the spectrum. UV is limited, a small slice too. Visual has a larger spectrum. These narrow bands can be advantages, but they also provide less data.

 

To a CCD that is designed for a specific wavelength, it all gives the same output. That is then converted to an image.

 

IR is good because it contrasts well, UV for the same reason. But because of the narrow band, each is also easier to jam.

 

And exactly how is more data bad?

 

If you think more data is bad, then how about we just take the RWR out of the jets? I mean, you know, too much data and all.

 

The more data, the more that can be compared. IR gets jammed? So what, you have UV and visual. UV gets jammed? So what, you've got IR and visual. Throw up smoke, and who cares - you've got UV, maybe IR.


How is visual light "bad"? Provide something other than your opinion.

 

My 'opinion' comes from researching weapon guidance AI, facial recognition, and visual alarm systems. No, I'm not a PhD, just an amateur, but I know what I'm talking about.

 

I'll give you an example of why visual is not so good: the moment you have any lighting change, the processing algos have a very hard time coping with it. The target literally looks like something completely different to the computer just by rolling and making part of itself darker or lighter. Furthermore, all you need is a clever paint scheme to make the seeker tell you 'you're on crack, there ain't nothin' there'.
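A toy example of the lighting-change problem (the pixel values are invented): a naive sum-of-absolute-differences matcher scores the *same* target as a very different one after its brightness halves.

```python
# Toy illustration of why plain visual matching is brittle: a simple
# sum-of-absolute-differences (SAD) matcher sees the SAME target as a
# very different object after a lighting change. Pixel values invented.

def sad_score(template, image):
    """Sum of absolute differences over pixels; 0 means a perfect match."""
    return sum(abs(t - i) for t, i in zip(template, image))

template  = [100, 120, 110, 130]   # target as first imaged
same_pose = [101, 119, 111, 129]   # same target, same lighting: near-zero score
shadowed  = [50, 60, 55, 65]       # same target, half as bright: huge score
```

Real trackers normalize for brightness, of course, but every such correction is extra processing and extra failure modes - which is the point being argued here.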

 

IR is limited - it's limited to a small slice of the spectrum. UV is limited, a small slice too. Visual has a larger spectrum. These narrow bands can be advantages, but they also provide less data.

 

Exactly. You WANT less data. You want the LARGEST amount of USEFUL data that you can PROCESS in a reasonable time, and this time is very small. When you can naturally throw out all the confusing or computationally intensive stuff, why not do it? Incidentally, this is why cruise missiles use optics with MITL guidance, and radar when MITL is not available - the radar image helps the missile stay on course and avoid obstacles without the confusion of trying to optically determine how far away the target is, etc. The missile compares a map already built from radar-satellite data to the radar image it sees, and this works out well because such a missile really doesn't have all that much to do and, more importantly, it doesn't need to do it a hundredth as fast as an AA missile.
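The map-matching idea mentioned above can be sketched in a heavily simplified, TERCOM-style form (all profile numbers are invented): slide the sensed terrain profile along the stored reference and keep the offset with the smallest mismatch.

```python
# Rough sketch of stored-map vs. sensed-profile matching, heavily
# simplified from real terrain-contour matching. All numbers invented.

def best_offset(reference, sensed):
    """Return the shift of `sensed` within `reference` with least error."""
    best, best_err = 0, float("inf")
    for shift in range(len(reference) - len(sensed) + 1):
        window = reference[shift:shift + len(sensed)]
        err = sum(abs(r - s) for r, s in zip(window, sensed))
        if err < best_err:
            best, best_err = shift, err
    return best

reference = [10, 12, 30, 45, 44, 20, 11, 10]   # stored map profile
sensed    = [31, 44, 45]                        # what the missile measures
# best_offset(reference, sensed) locates the sensed strip inside the map
```

Note how little per-update work this is compared with segmenting and tracking an arbitrary maneuvering target in a live image - which is the contrast being drawn.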

 

Another difference here is that the AA missile will be fired into a non-predetermined environment, while cruise missiles have their autopilots carefully set up and flight paths pre-planned to help them out, so they 'know' what they're doing. Point is, all of this reduces computation.

 

To a CCD that is designed for a specific wavelength, it all gives the same output. That is then converted to an image.

 

IR is good because it contrasts well, UV for the same reason. But because of the narrow band, each is also easier to jam.

 

Sure, you have to live with that; but there are better ways of rejecting CMs now, and CMs must once more catch up to seeker technology as it stands. The fact that you currently track an image at all is pretty big in terms of rejecting CMs, whereas before all you could track was a 'heat spot', and that in a manner which was pretty susceptible to decoys. Google reticle seekers; you might find something that'll explain that one well.

The CCD also tracks a 'heat spot', but it can probably process the image a few times a second, compare it to the last 2-5 frames and build a 'blob', which it can then compare against the previous images to make sure it's not tracking a flare. Because the flare will look fairly spherical in IR, it will likely be given a lower rating for tracking purposes than the aircraft blob, but this too depends on the range you're shooting at.
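A toy version of that blob-rating idea (all blob dimensions are invented, nothing here is from a real seeker): rate each blob in the new frame by how well its shape matches the blob tracked over earlier frames, so the round flare blob scores worse than the elongated aircraft blob.

```python
# Toy sketch of frame-to-frame blob rating: a flare appears as a new,
# roughly circular blob and matches the tracked shape poorly. Blobs are
# (width, height) bounding boxes in pixels; all numbers are invented.

def shape_rating(blob, tracked):
    """Lower is better: aspect-ratio mismatch against the tracked shape."""
    aspect = blob[0] / blob[1]
    tracked_aspect = tracked[0] / tracked[1]
    return abs(aspect - tracked_aspect)

def pick_target(blobs, tracked):
    """Index of the blob that best matches the previously tracked shape."""
    ratings = [shape_rating(b, tracked) for b in blobs]
    return ratings.index(min(ratings))

tracked_shape = (30, 10)       # elongated aircraft blob from prior frames
blobs = [(28, 11), (9, 9)]     # candidates; (9, 9) is the round flare blob
```

A real FPA tracker would use far richer features (intensity history, kinematics, growth rate), but the select-by-consistency-with-previous-frames structure is the same.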

 

And exactly how is more data bad?

 

If you think more data is bad, then how about we just take the RWR out of the jets? I mean, you know, too much data and all.

 

Well, that's a pretty silly argument. Even the RWR has to sift through and filter out stuff it doesn't want to 'see' (want to see an FM station on your RWR? No? How about solar flares? HAM radio? Satellite transmissions? Pictures from the Mars rovers?).

The RWR is totally and utterly irrelevant to any discussion about visual tracking of anything at this time.

See above for how more data is bad, but I'll expound:

 

Optical tracking is CONFUSING. I haven't seen a visual tracking algo YET that can cope even with simple environmental lighting changes, let alone a jet rolling, turning, and going from clear sky to cloudy sky or a dark earth background, etc. It's a /bad/ deal all around. Period.

 

The more data, the more that can be compared. IR gets jammed? So what, you have UV and visual. UV gets jammed? So what, you've got IR and visual. Throw up smoke, and who cares - you've got UV, maybe IR.

 

No, all you need to defeat visual tracking is to turn tail, dump some fuel and roll out. The fuel will contrast nicely, the missile will lock onto it and forget your aircraft ever existed.

IR can correct for this, but at that point, why were you burdening your missile with visual again? It's just an additional system which adds additional cost to the weapon and additional demands on programming and processing power, and it really doesn't have the return on investment one would like. It's a step *back*, not forward. The art of tracking objects concerns itself with discarding irrelevant data -first- and ECCM second, since all the ECCM in the world won't help you when it just so happens that the nice low-vis paint scheme of an enemy aircraft against a more or less same-colored sky makes your missile go stupid. And there's the issue of night employment, too.

 

Oh, but wait! You can use IR! ... Hm ... gee, I wonder why missile designers aren't fitting visual-spectrum tracking devices to their AA missiles ... Hmmmm *rubs chin*



I heard of something like that that is supposed to be used on later Typhoon models. I think it was some sort of advanced chaff made from a slightly different material. When it is released, the internal jammer in the Typhoon sends a burst of noise jamming at a particular frequency that bounces off the chaff cloud and generates a larger false target far from both the aircraft's and the chaff cloud's position. I believe it was called 'jaff'.

 

I read that back in 1999; it could have been complete hearsay, but that's what I gathered from it. Never heard of IR chaff before.

 

Close. It's actually called "CHILL". Stands for chaff illumination. Not an easy technique to pull off correctly.


Thanks for the long-winded reply. I shouldn't have bothered reading most of it, especially after the "I know what I am talking about". Well, if you want to go bragging about what you know: I have a degree in Electronic Engineering. Not a PhD. I currently work at Intel's main fab, on probably the most complicated piece of capital machinery made; I know what I am talking about. And it doesn't matter if it's guidance or your Xbox, it's still just electronics. Visual, IR, UV - doesn't matter. The only difference is the CCD type used and the optics material used.

 

The exact reason you want more data is so that it can be rejected.

 

If you just have a plain IR tracker, then it is easily jammed. Simple sensor, simple counter-measure. That is a fact.

 

To a CCD there is no difference between IR, UV or visual. It gets the same signal, and the noise from CMs is rejected the same way.

 

Patterns are rejected the same way. All it is, is more data, and it's useful regardless of, yes, your opinion.

 

Did you ever think, like I said, that with visual acquisition, if the target turned tail and ran and dumped fuel to block the visual, you would still have IR? =O! So why bother with visual? Oh wait, aren't there ways to counter IR? Oh yes, that's right, there are. It's called a flare. And a flare is easier to reject in a visual picture than in IR.

 

You can't reject data if you don't have it. You are right, the RWR has nothing to do with it - except for the fact that it was an ANALOGY. You say data is bad; I made the point that it's not.

 

Just because you are a 1.2 beta tester doesn't mean yours is the only opinion that counts around here.

 

You talk about processing power. The processing isn't demanded by creating the image; it's demanded by the advanced algorithms that weed out false positives.

 

Anyway, it's clear you need to have your way in this conversation, so "yes dear, you are absolutely right".


Thanks for the long-winded reply. I shouldn't have bothered reading most of it, especially after the "I know what I am talking about". Well, if you want to go bragging about what you know: I have a degree in Electronic Engineering. Not a PhD. I currently work at Intel's main fab, on probably the most complicated piece of capital machinery made; I know what I am talking about. And it doesn't matter if it's guidance or your Xbox, it's still just electronics. Visual, IR, UV - doesn't matter. The only difference is the CCD type used and the optics material used.

 

Sure it matters; the quality of the data is completely different.

You know what, I have a degree in Computational Mathematics, which is much closer to the subject of imaging seekers in any case. Please don't confuse building silicon with data processing. It's not 'just electronics'; rather, it's all about the algos, as well as physical means of rejection (i.e. rejecting certain wavelengths via optical filters).

 

 

The exact reason you want more data is so that it can be rejected.

 

Only if it does something useful for you. If you're going to reject it anyway, you may as well filter it out from the get-go.

 

If you just have a plain IR tracker, then it is easily jammed. Simple sensor, simple counter-measure. That is a fact.

 

What's a 'plain IR tracker'? Even reticle seekers today have sophisticated ECCM capabilities.

 

This is a relatively useful document, and since you have an EE degree you'll probably understand it even better than I do:

 

http://www.ausairpower.net/TE-IR-Guidance.html

 

To a CCD there is no difference between IR, UV or visual. It gets the same signal, and the noise from CMs is rejected the same way.

 

Patterns are rejected the same way. All it is, is more data, and it's useful regardless of, yes, your opinion.

 

No, it's more data that you have to process with less narrowly defined parameters, which makes the algos chew on it longer. You could of course image in B&W or grayscale, but I fail to see the point when you already have perfectly good IIR tracking.

 

Oh, and while the 'signal' may be the same, the data isn't. Visual images introduce much more clutter, and while you -can- probably remove it, you'll need more processing time to chew on it, and even then you're not safe from sudden lighting changes.

 

Did you ever think, like I said, that with visual acquisition, if the target turned tail and ran and dumped fuel to block the visual, you would still have IR? =O! So why bother with visual? Oh wait, aren't there ways to counter IR? Oh yes, that's right, there are. It's called a flare. And a flare is easier to reject in a visual picture than in IR.

 

Actually, no. It's easier to reject in IR, because there are already established ways of doing so with reticle seekers, and in addition, with imaging you can now do more consistent spatial separation. An IR flare isn't going to blind an IIR seeker; that's not how it works. There are reasons why it works on reticle seekers, but it's not 'blinding'.

By the way, I'm sure that somewhere on Raytheon's site you can find product descriptions for their FPAs mentioning that those devices are practically immune to IR pulse jammers, and that the whole system is very flare-resistant too, to the point where you need to develop new CMs against them.

 

You can't reject data if you don't have it. You are right, the RWR has nothing to do with it - except for the fact that it was an ANALOGY. You say data is bad; I made the point that it's not.

 

You didn't prove anything; rather, you showed that you don't understand the problem, because the RWR is a pretty flawed analogy in a lot of ways insofar as comparing it to an IIR seeker goes. The RWR classifies and determines the direction of radio signals; it has to do this pretty quickly, but you can devote quite a bit of hardware space and electrical power to it, too. Ever noticed how big the HTS pod is, by any chance? The RWR does a LOT of work, but it doesn't do image processing. And insofar as having data goes, it never even knows about radio stations existing, for example - why even process data that's useless for what it's trying to do? ;)

 

This is data that you'd reject anyway, so you may as well not see it. Useless data. Get it? But hey, that again is not a good analogy on my part to a visual seeker, because the data produced by a visual seeker can be potentially useful for attacking a target, right?

 

So now you have to decide if it's worth processing that data.

 

Just because you are a 1.2 beta tester doesn't mean yours is the only opinion that counts around here.

 

I don't need to be a beta tester for my opinion to count. Where did I say 'hey look, I'm a Beta tester, respect my authority'?

 

You talk about processing power. The processing isn't demanded by creating the image; it's demanded by the advanced algorithms that weed out false positives.

 

Right. That's not news to me. I've worked with computer vision for some time now.

 

Again, once you have an IR image (and by the way, what's this about UV seekers? I haven't seen a single missile advertised with one of those yet, except as a filter used to reject flares...), everything else is extraneous.

You already have to spend time and power tracking THAT image and processing THAT image.

Throw in visual, and suddenly you need different image-processing algorithms because of the potential radical changes in lighting and contrast - your data behaves differently, so it has to be treated differently.

It's a waste of computational resources and electricity, and it probably won't track any better anyway - you could reject the flare by contrast, but then again you can do that in the IR image as well, or just via the UV filter (at least until they start making flares that emit less UV).

At the same time, a longer ranged tail-on or head-on attack will have such low contrast that the missile will likely never lock onto the fighter, whereas the IIR seeker will snap it right up.

 

 

Anyway, it's clear you need to have your way in this conversation, so "yes dear, you are absolutely right".

 

Have any useful arguments to offer, or do you need to look up 'ad hominem'?



I am inclined to agree with Prophet here. The missile probably uses the IR sensor most of the time, as long as it sees only one heat source. If the aircraft pops a flare, the missile suddenly sees two targets. If you have a visual-spectrum sensor, then AT THAT POINT ONLY you can simply overlay the IR and visual-spectrum images to determine which one is the flare (you can use just single-frame data for that). At most aircraft-to-missile angles, the flare produces much more visible yellow-orange light than the aircraft, so you can use the visible spectrum to distinguish aircraft from flare. The missile does not have to process all the visible-spectrum pixels of the image - it can, for example, process only those pixels which overlap with the "heat pixels" of the IR sensor. After this decision is made, the missile can switch back to pure IR mode, following the correct target.
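That overlay idea can be sketched in a toy form (all pixel values and thresholds are invented): consult the visible channel only at pixels the IR channel already flags as hot, and call the hot spot that is also intensely bright in the visible band the flare.

```python
# Sketch of IR/visible overlay discrimination: visible-band data is
# consulted ONLY where the IR channel is hot, keeping the visible-band
# processing load tiny. All values and thresholds are invented.

def discriminate(ir_pixels, vis_pixels, ir_hot=0.6, vis_bright=0.7):
    """Label each pixel using visible data only where IR is hot."""
    labels = []
    for ir, vis in zip(ir_pixels, vis_pixels):
        if ir < ir_hot:
            labels.append("cold")       # not hot: never even look at visible
        elif vis >= vis_bright:
            labels.append("flare")      # hot AND visibly bright
        else:
            labels.append("aircraft")   # hot but visibly dim
    return labels

ir_pixels  = [0.1, 0.9, 0.8]
vis_pixels = [0.2, 0.95, 0.3]
```

This mirrors the point in the post: the visible channel is a single-frame tiebreaker, not a full-time tracker.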

 

Gharos, the fact that you don't see how a visible-spectrum sensor can be used does not mean that somebody else will not find a way to use it effectively. And if it is not the algorithm I described above, it doesn't matter - somebody might find a different one ...


I am inclined to agree with Prophet here. The missile probably uses the IR sensor most of the time, as long as it sees only one heat source. If the aircraft pops a flare, the missile suddenly sees two targets. If you have a visual-spectrum sensor, then AT THAT POINT ONLY you can simply overlay the IR and visual-spectrum images to determine which one is the flare (you can use just single-frame data for that). At most aircraft-to-missile angles, the flare produces much more visible yellow-orange light than the aircraft, so you can use the visible spectrum to distinguish aircraft from flare. The missile does not have to process all the visible-spectrum pixels of the image - it can, for example, process only those pixels which overlap with the "heat pixels" of the IR sensor. After this decision is made, the missile can switch back to pure IR mode, following the correct target.

 

Gharos, the fact that you don't see how a visible-spectrum sensor can be used does not mean that somebody else will not find a way to use it effectively. And if it is not the algorithm I described above, it doesn't matter - somebody might find a different one ...

 

Maybe my argument isn't clear enough ...

You'll reject the flare in the IR spectrum anyway (see, IIR FPAs can measure intensity, and flares are rather intense compared to an aircraft - you could call this 'color' ... ever seen false-color thermal images?).

 

So my question again ... why add the step of using the visual when it's not giving you additional capability, and is degraded very easily by weather?

 

These are pretty real seeker/target tracking considerations.

 

NOWHERE in any AF paper I've read is the visual spectrum even *CONSIDERED* for missile guidance. Of course, I don't have access to everything, so who knows - it may be out there.

 

The only visual-spectrum seekers I've seen are used in AG weapons, and those are typically fraught with problems. Some of those may have been resolved, given that the Mav-K uses a vis-spectrum CCD, but then AGAIN, the Mav-K is a BUDGET missile, and it's quite likely that using a visual-spectrum CCD avoids the costs associated with the complex and expensive optical filters that are apparently fitted to IR and IIR missiles. It also avoids the need to cool the seeker, so it reduces maintenance.

 

Finally, it's a daytime weapon, and pretty obviously not 'all weather'; even rain will cut an optical system down to size compared to an IIR system.

 

Unfortunately for the AAM, it DOES require all-weather capability and DOES require night capability, so the choice of IIR is painfully obvious, and the use of a dual IIR/visual system painfully extraneous.



If you rejected the flare anyway, then flares wouldn't be used as CMs.

 

They work on reticle-seeker missiles, which are -currently- the most common and feature a variety of ECCM - some are easier to decoy than others.

 

Any missile using an FPA (like the AIM-9X or the new Stinger) is essentially immune, if the manufacturer's claims are to be believed.

 

This is why new CMs are currently being developed to deal with this - things like self-propelled flares, 'smart' countermeasures and all sorts of other neat stuff. Kinematic filtering, I believe, is present in reticle seekers already.

 

And before I forget, IIR and IR missiles do indeed compare two different (very narrow) IR bands to which the atmosphere is transparent; this 'two-color' operation apparently helps greatly in the rejection of flares, since an aircraft may well radiate more in one band and not as much in the other, etc. Additional things like band-pass filters also help (typically used to filter out the sun).
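A toy sketch of that two-color test (the bands, intensities and cutoff ratio are all invented): a flare burns much hotter than an exhaust plume, so its energy distribution between two narrow IR bands differs markedly from an aircraft's.

```python
# Sketch of 'two-color' IR flare rejection: compare the intensity ratio
# between two narrow IR bands. A flare's hotter burn shifts energy
# toward the shorter-wavelength band. The cutoff ratio is invented.

def is_flare(short_band, long_band, ratio_cutoff=2.0):
    """Flag a hot spot whose short/long band ratio looks flare-like."""
    if long_band <= 0:
        return False        # no long-band energy measured; don't flag
    return (short_band / long_band) >= ratio_cutoff
```

Real seekers pick the two bands from atmospheric-transmission windows and calibrate the cutoff against measured flare and plume signatures; the comparison structure is the same.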



UV has been brought up before. I don't know if it is used in A2A missiles.

 

I do know that the TOW2 anti-armor missile has an IR lamp and a UV beacon on the back, used to track the missile. It tracks well enough out to 1650 m, and that's at the edge of visual range against armor.

 

I have fired that weapon, and used the IR scope on it extensively.

 

There are multiple uses for the visual spectrum. For instance, you may get intensity from IR, but what if the intensity of the flare matches that of the engine? Now suppose you had a visual-spectrum sensor alongside it. The wavelength spectrum of the flare will not be the same as the spectrum of the engine. All you need is a diffraction grating to see that.
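The diffraction-grating point can be sketched like this (every spectrum below is invented): even when two sources have equal total intensity, the *shape* of the spectrum differs, so normalize each spectrum and match the measured shape against stored references.

```python
# Toy sketch of spectral-shape discrimination: normalize away total
# intensity and compare the distribution across wavelength bins against
# reference shapes. All spectra are invented illustrations.

def normalize(spectrum):
    """Scale a spectrum so its bins sum to 1 (shape only, no intensity)."""
    total = sum(spectrum)
    return [s / total for s in spectrum]

def closest_signature(measured, references):
    """Name of the reference whose normalized shape is nearest `measured`."""
    m = normalize(measured)
    def dist(ref):
        return sum(abs(a - b) for a, b in zip(m, normalize(ref)))
    return min(references, key=lambda name: dist(references[name]))

references = {
    "engine": [1, 3, 6, 4],   # broad hot-metal/plume shape
    "flare":  [8, 5, 2, 1],   # magnesium-like: strong short-wavelength peak
}
```

Note that `[4, 2.5, 1, 0.5]` has twice the intensity of a scaled flare shape yet still classifies as one - which is exactly the equal-intensity scenario described above.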

 

There are more filter options for visual, since it is a larger spectrum: look at only red, only green, only yellow or only blue.

 

Red and blue make little sense, since they are close to IR and UV respectively, but in IR and UV you have a very narrow band of wavelengths to discriminate with. Visual is large. Sure, it seems easy to defeat, but you judge that by what you can see. Electrically, it's the same though. Just because the image you see is blurred doesn't mean the image the CCD sees needs to be.

 

Wavefront metrology: it's a method to remove atmospheric distortion. They use it in photolithography (me) and in ground-based stellar telescopes.


And just for the record, I am not rejecting the fact that IR is a very very good way of target acquisition.

 

I am just arguing the point that visual is not.

 

http://www.dynetics.com/solutions/test/testHardware/seeker.htm

 

People are testing it. Also, IR, visual, UV and radar are all the same thing: electromagnetic waves. UV and visual provide better data resolution, since they are shorter wavelengths.

 

Look up LADAR.


I'm aware of the physics involved, but I'm sure we can agree that despite being 'the same thing', once you hop frequencies a good deal, the method of use and the properties of the data received are quite different for each method, which is pretty significant - after all, different CMs are used against different guidance methods, for one thing, right?

 

About your link; pretty neat, though I still think visual spectrum has no business in an AA-fight for the time being; I've laid out my reasons above so I won't rehash them. Let's just say that that's my line and I'm sticking to it :)

 

The UV range has me a little perplexed because of the specific properties of that spectrum, but I may remember some things about it wrong, so I'll have to check ... As I said, the only time I've heard of it being used was as a filter that helped discriminate flares from aircraft for reticle seekers (because flares tended to burn magnesium, which produces a lot of UV).



Wow, the - interesting - discussion got somewhat overheated! (Always the problem with IR-sensors BTW).

 

But let's not get oversensitive. I agree that GGTharos has a valid point that IR is currently the standard for non-radar A2A, and for the reasons mentioned: IR imaging gives you basically the info you need.

 

Most of today's IR-seeker FPAs are CCDs (CMOS still has some problems).

 

The point Prophet and I are making is:

 

- that these CCDs are perfectly capable of sensing a broader spectrum than IR;

- that you could process these sensor data to obtain useful information, given the phenomenal gains in current computing power;

- that this information can be overlaid, fused or combined with IR data to compose a computer-generated "image", in fact a matrix of associated data;

- that this can be used by algorithms for better locking;

- and that this can also be used for better rejection.
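As a toy illustration of the fusion idea in that list (the weights and readings are invented, not from any real system): stack the per-pixel band readings into one feature vector and let a single scoring rule use all bands at once.

```python
# Toy sketch of multispectral fusion: per-pixel IR, visible and UV
# readings are stacked into one feature tuple ("a matrix of associated
# data") and scored together. Weights and values are invented.

def fuse(ir, vis, uv):
    """Combine per-pixel band readings into per-pixel feature tuples."""
    return list(zip(ir, vis, uv))

def target_score(features, weights=(1.0, 0.3, -2.0)):
    """Score pixels; UV gets a negative weight since flares are UV-bright."""
    return [sum(w * f for w, f in zip(weights, feat)) for feat in features]

fused = fuse([0.9, 0.8], [0.3, 0.9], [0.0, 0.7])
# pixel 0 (hot, UV-dark) outscores pixel 1 (hot, UV-bright flare)
```

The point of the sketch is only structural: once the bands live in one data matrix, adding a band means adding a weight, not a new tracker.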

 

I think Python 5 is a technology demonstrator for this.

 

On the MAV:

 

- I agree the CCD-input-based locking mechanism is slower than that of the IR version and more subject to clutter and bad weather.

- However, current ROE often imply that you first visually identify a possible target, politely present yourself, give them a chance to leave, and only after 10 phone calls and some interventions from ambassadors are you allowed to fire a missile. So I guess in bad weather, even with IR, your missiles are homebound :=)

- I think ANY missile should be a "BUDGET" missile, because only scalable, reproducible technology is sustainable: a modern CCD means lighter, less maintenance, higher reliability, more payload - from which I would deduce a better weapon system.

- Using a modern multispectral CCD offers growth potential that follows increases in computing power.


