liaorili
2009-01-18, 11:45
Create a MATLAB program to do the following (see the sketch after this list):
1. Generate a set of floating-point ones and zeros (each 1 or 0 is held for 20 samples in time).
2. Generate another such set with a time delay.
3. Filter the first set of points to represent the radar wave.
4. Multiply (point by point) the filtered waveform and the delayed, unfiltered waveform.
5. Low-pass filter the resulting waveform.
6. Plot the original waveforms as a function of time.
7. Plot the output of the low-pass filter as a function of delay time.
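A minimal sketch of these steps might look like the following. The bit count, sample rate, delay, and the Butterworth filter orders and cutoffs are all assumptions (the problem only fixes 20 samples per bit), and butter requires the Signal Processing Toolbox; any other band-shaping and smoothing filters would do.

nBits = 50;                           % number of random bits (assumption)
spb   = 20;                           % samples per bit, from the problem statement
fs    = 1;                            % normalized sample rate (assumption)
lag0  = 35;                           % delay of the second set, in samples (assumption)

% Step 1: floating-point ones and zeros, each held for 20 samples
bits = double(rand(1, nBits) > 0.5);
x    = kron(bits, ones(1, spb));
N    = numel(x);
t    = (0:N-1)/fs;

% Step 2: delayed copy (zero-padded shift)
xd = [zeros(1, lag0), x(1:N-lag0)];

% Step 3: filter the first set to stand in for the radar wave
% (order and cutoff are assumptions)
[b, a] = butter(4, 0.2);
xf = filter(b, a, x);

% Step 4: point-by-point product of filtered and delayed waveforms
p = xf .* xd;

% Step 5: low-pass filter the product (narrow cutoff assumed)
[bl, al] = butter(4, 0.02);
y = filter(bl, al, p);

% Step 6: plot the original waveforms against time
figure;
subplot(2,1,1); plot(t, x);  xlabel('time'); ylabel('x');  title('original bit stream');
subplot(2,1,2); plot(t, xd); xlabel('time'); ylabel('xd'); title('delayed bit stream');

% Step 7: sweep the delay of the second set and plot the low-pass
% output (its steady-state average, an assumption) versus delay time
lags = 0:2*spb;
out  = zeros(size(lags));
for k = 1:numel(lags)
    xk     = [zeros(1, lags(k)), x(1:N-lags(k))];
    yk     = filter(bl, al, xf .* xk);
    out(k) = mean(yk(round(N/2):end));
end
figure;
plot(lags/fs, out);
xlabel('delay time'); ylabel('low-pass filter output');

The final plot is effectively a correlation of the radar-filtered waveform with shifted copies of the bit stream, so it should peak where the swept delay matches the filter's small group delay and fall to a baseline once the shift exceeds one bit (20 samples).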