opencv – How to use a Kalman filter to predict GPS positions in between measurements
I have studied the OpenCV Kalman filter implementation, done the basic mouse-pointer simulation, and understand the basics. I still seem to be missing some key points for using it in my application, and I hope someone here can provide a small example.
Using a simple model with position and velocity:
KF.statePre.at<float>(0) = mouse_info.x;
KF.statePre.at<float>(1) = mouse_info.y;
KF.statePre.at<float>(2) = 0;
KF.statePre.at<float>(3) = 0;
KF.transitionMatrix = (Mat_<float>(4, 4) << 1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1);
setIdentity(KF.measurementMatrix);
setIdentity(KF.processNoiseCov, Scalar::all(1e-2));
setIdentity(KF.measurementNoiseCov, Scalar::all(1e-1));
setIdentity(KF.errorCovPost, Scalar::all(.1));
I can make a prediction:
Mat prediction = KF.predict();
and I can make a correction:
Mat estimated = KF.correct(measurement);
Doing this in a loop, I do not fully understand what prediction, estimated, and measurement mean.
I do understand that measurement is a "real" value observed with some uncertainty, say a GPS latitude and longitude.
I think this video shows some interesting ideas: https://www.youtube.com/watch?v=EQD0PH09Jvo. It uses a GPS measurement unit that updates at 1 Hz and then uses a Kalman filter to predict values at a 10 Hz rate.
Would a setup like the following be the way to do it?
Mat prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
Mat estimated = KF.correct(measurement);
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
prediction = KF.predict();
estimated = KF.correct(measurement);
I am not sure whether it is valid to predict nine values and then supply a measurement for the tenth.
I have some recorded data that I would like to test this with.
The recorded data is 1 Hz GPS data in a file where each line is timestamp:lat:lng, plus a separate file with a series of events recorded at 15 Hz: timestamp:eventdata.
I would like to use a Kalman filter to simulate the data collection and predict positions for all the eventdata timestamps based on the 1 Hz measurements. A complete example of doing this with OpenCV would be great, but starting pointers for the predict/correct questions above are also acceptable.
Update
I gave it a try:
#include <fstream>
#include <iostream>
#include <vector>
#include <opencv2/video/tracking.hpp>

using namespace cv;
using namespace std;

int main()
{
    const char * log = "C:/dev/sinnovations/opencv_examples/kalman_tracking/data/geosamples.txt";
    const char * log1 = "C:/dev/sinnovations/opencv_examples/kalman_tracking/data/geosamples1.txt";
    ifstream myReadFile(log);
    ofstream myfile(log1);
    myfile.precision(15);
    KalmanFilter KF(4, 2, 0, CV_64F);
    Mat_<double> state(4, 1);
    Mat_<double> processNoise(4, 1, CV_64F);
    Mat_<double> measurement(2, 1); measurement.setTo(Scalar(0));
    KF.statePre.at<double>(0) = 0;
    KF.statePre.at<double>(1) = 0;
    KF.statePre.at<double>(2) = 0;
    KF.statePre.at<double>(3) = 0;
    KF.transitionMatrix = (Mat_<double>(4, 4) << 1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1); // Including velocity
    KF.processNoiseCov = (Mat_<double>(4, 4) << 0.2, 0, 0.2, 0, 0, 0.2, 0, 0.2, 0, 0, 0.3, 0, 0, 0, 0, 0.3);
    setIdentity(KF.measurementMatrix);
    setIdentity(KF.processNoiseCov, Scalar::all(1e-4)); // note: this overwrites the processNoiseCov assigned above
    setIdentity(KF.measurementNoiseCov, Scalar::all(1e-1));
    setIdentity(KF.errorCovPost, Scalar::all(.1));
    std::vector<cv::Point3d> data;
    double ms, lat, lng, speed;
    while (myReadFile >> ms >> lat >> lng >> speed) // loop on extraction instead of eof()
    {
        data.push_back(cv::Point3d(ms, lat, lng));
    }
    std::cout << data.size() << std::endl;
    for (int i = 1, ii = (int)data.size(); i < ii; ++i)
    {
        const cv::Point3d & last = data[i - 1];
        const cv::Point3d & current = data[i];
        double steps = current.x - last.x;
        std::cout << "Time between points: " << current.x - last.x << endl;
        std::cout << "Measurement: " << current << endl;
        measurement(0) = last.y;
        measurement(1) = last.z;
        Mat estimated = KF.correct(measurement);
        std::cout << "Estimated: " << estimated.t() << endl;
        Mat prediction = KF.predict();
        for (int j = 0; j < steps; j += 100)
        {
            prediction = KF.predict();
            myfile << (long)((last.x - data[0].x) + j) << " "
                   << prediction.at<double>(0) << " " << prediction.at<double>(1) << endl;
        }
        std::cout << "Prediction: " << prediction.t() << endl << endl;
    }
    return 0;
}
But something is missing, as the results in the picture show. The red circles are the recorded values and the blue line is the predicted values for the first coordinate of the lat/lng values:
I do not think the way I am handling the time values of the observations and the predictions is correct.