Object Tracking: The Centroid Tracking Algorithm

Implementing the centroid tracking algorithm

Only the CPU version is given here for now; a GPU version will be added later.

Most of the "centroid tracking" algorithms I found online are not object tracking in the strict sense; rather, they are used in multi-object tracking to link the same detection across frames produced by an object detector. During my survey I did not find a reasonably complete implementation of the centroid tracking algorithm itself, so this article implements it in C++, drawing mainly on the following references:

  1. https://wenku.baidu.com/view/ce958ce30a1c59eef8c75fbfc77da26924c59668.html
  2. https://wenku.baidu.com/view/04783361b9d528ea80c7791c.html?fr=search

The centroid tracking algorithm consists of four main steps:

  1. inter-frame differencing within the gate;
  2. thresholding within the gate;
  3. denoising within the gate;
  4. computing the centroid position within the gate.


Inter-frame differencing within the gate

Adjacent frames are differenced within the gate. To capture only the amount of change between adjacent frames, I take the absolute difference:

Diff_{x,y} = |f^{i+1}_{x,y} - f^{i}_{x,y}|

where f^{i+1} and f^i are the gate-cropped images from adjacent frames, and x, y are pixel coordinates. The absolute difference characterizes how much the target changes between frames. The corresponding call is:

cv::absdiff(gray_image(_roi), this->gray_image(_roi), diff);
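For readers who want to try this step on its own, here is a minimal, self-contained sketch of gate-limited frame differencing. The frame variables and the gate rectangle are made-up names used only for illustration; the article's own code does this inside CentroidTracker::getDiff further below.

#include <opencv2/opencv.hpp>

// Sketch: absolute difference of two consecutive frames, restricted to a gate.
// prev_frame, curr_frame and gate are hypothetical inputs.
cv::Mat gateDiff(const cv::Mat& prev_frame, const cv::Mat& curr_frame, const cv::Rect& gate)
{
    cv::Mat prev_gray, curr_gray, diff;
    cv::cvtColor(prev_frame, prev_gray, cv::COLOR_BGR2GRAY);
    cv::cvtColor(curr_frame, curr_gray, cv::COLOR_BGR2GRAY);
    // Difference only within the gate; the absolute value captures the change magnitude.
    cv::absdiff(curr_gray(gate), prev_gray(gate), diff);
    return diff;
}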


Thresholding within the gate

The difference image may itself have low contrast, so we binarize it to further emphasize the target's change. Ordinary binarization can use a fixed threshold such as 127, but for a difference image the threshold should be chosen adaptively. Here the maximum between-class variance (Otsu) method is used to select the segmentation threshold; see https://blog.csdn.net/zhu_hongji/article/details/80967776 for the underlying idea.

int CentroidTracker::getThresholdValue(cv::Mat& image);

Once the optimal threshold has been selected, the image is binarized with it.

void CentroidTracker::segmentByThreshold(cv::Mat& src, cv::Mat& dst, int threshold);
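As a sanity check, the same adaptive threshold-plus-binarization can be obtained with OpenCV's built-in Otsu mode of cv::threshold. This is only an alternative sketch for comparison, not what the article's code uses; diff here stands for the difference image from the previous step.

// Alternative sketch: Otsu thresholding via OpenCV's built-in implementation.
cv::Mat binary;
double otsu_thresh = cv::threshold(diff, binary, 0, 255,
                                   cv::THRESH_BINARY | cv::THRESH_OTSU);
// otsu_thresh should be close to the value returned by getThresholdValue(diff).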


Denoising within the gate

The denoising step is optional. In my implementation I use erosion followed by dilation (left commented out in the code below): erosion removes isolated white noise pixels, and dilation re-emphasizes the target's change. The structuring element is 3×3.
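A sketch of that optional erosion-plus-dilation step (a morphological opening) with a 3×3 structuring element is given below; in the full listing further down the equivalent lines are commented out, and seg_image stands for the binarized gate image.

// Sketch: remove isolated white noise, then re-grow the surviving target region.
cv::Mat kernel = cv::getStructuringElement(cv::MORPH_RECT, cv::Size(3, 3));
cv::Mat denoised;
cv::erode(seg_image, denoised, kernel);   // erosion removes small white specks
cv::dilate(denoised, denoised, kernel);   // dilation re-emphasizes the remaining change
// Equivalently: cv::morphologyEx(seg_image, denoised, cv::MORPH_OPEN, kernel);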


Computing the centroid position within the gate

The centroid is computed on the thresholded image. The basic idea is to treat each pixel's value as a weight and take the weighted mean of the pixel coordinates:

\bar{x} = \frac{\sum_x\sum_y x f_{x,y}}{\sum_x\sum_y f_{x,y}}
\bar{y} = \frac{\sum_x\sum_y y f_{x,y}}{\sum_x\sum_y f_{x,y}}

cv::Point_<int> CentroidTracker::findCentroid(cv::Mat& src);
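The same weighted mean can also be read off OpenCV's image moments (x̄ = m10/m00, ȳ = m01/m00), which makes a convenient cross-check against findCentroid. The snippet below is only an illustrative sketch; seg_image stands for the binarized gate image.

// Sketch: centroid via image moments, equivalent to the weighted-mean formula above.
cv::Moments m = cv::moments(seg_image, /*binaryImage=*/false);
cv::Point centroid = (m.m00 != 0)
    ? cv::Point(cvRound(m.m10 / m.m00), cvRound(m.m01 / m.m00))
    : cv::Point(seg_image.cols / 2, seg_image.rows / 2); // fall back to the gate centre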

The complete code is as follows:

// centroid_tracker.h
#ifndef CENTROID_TRACKING_H_
#define CENTROID_TRACKING_H_

#include "trackers/tracker.h"

class CentroidTracker : public CustomTracker // CustomTracker is a custom abstract base class
{
public:
    CentroidTracker(bool debug = false);
    ~CentroidTracker();
    void init(cv::Mat image, const cv::Rect &roi);
    cv::Rect update(cv::Mat image);
    int getThresholdValue(cv::Mat &image);
    void segmentByThreshold(cv::Mat& src, cv::Mat& dst, int threshold);
    cv::Point_<int> findCentroid(cv::Mat& src); 
    cv::Mat getDiff(cv::Mat& gray_image);

protected:
    cv::Rect_<float> _roi;
    cv::Mat image;
    cv::Mat gray_image;
    bool DEBUG = false;
    cv::Point_<int> ccentroid;
    int ite_num;
    bool first_ite;
};

#endif
// centroid_tracker.cpp

#include "centroid_tracker.h"

CentroidTracker::CentroidTracker(bool debug)
{
    this->DEBUG = debug;
}

CentroidTracker::~CentroidTracker() {}

void CentroidTracker::init(cv::Mat image, const cv::Rect &roi)
{
    this->_roi = roi;
    this->image = image;
    cv::cvtColor(image, this->gray_image, CV_BGR2GRAY);
    ite_num = 0;
    first_ite = true;
}

cv::Point_<int> CentroidTracker::findCentroid(cv::Mat& src)
{
    // Compute the centroid from the grey values of the source image
    assert(src.type() == CV_8UC1);
    int sum, xsum, ysum;
    sum = xsum = ysum = 0;
    for (int i = 0; i < src.rows; i++)
    {
        for (int j = 0; j < src.cols; j++)
        {
            int pix = src.at<uint8_t>(i, j);
            sum += pix;
            xsum += pix * j;
            ysum += pix * i;
        }
    }
    int x,y;
    if (sum != 0)
    {
        x = (int) ((float)xsum / sum);
        y = (int) ((float)ysum / sum);
    }
    else
    {
        x = src.cols / 2;
        y = src.rows / 2;
    }
    return cv::Point_<int>(x, y);
}

void CentroidTracker::segmentByThreshold(cv::Mat& src, cv::Mat& dst, int threshold)
{
    dst = cv::Mat(src.rows, src.cols, src.type(), cv::Scalar_<uint8_t>(0));

    for (int i = 0; i < src.rows; i++)
    {
        for (int j = 0; j < src.cols; j++)
        {
            if (src.at<uint8_t>(i, j) >= threshold)
            {
                dst.at<uint8_t>(i, j) = 255u;
            }
        }
    }
}

// Find the segmentation threshold by maximum between-class variance (Otsu's method)
int CentroidTracker::getThresholdValue(cv::Mat& image)
{
    assert(image.type() == CV_8UC1);
    int width = image.cols;
    int height = image.rows;
    int x = 0, y = 0;
    int pixelCount[256] = {0};
    float pixelPro[256] = {0};
    int i, j;
    int threshold = 0;
    float pixelSum = width * height;
    //count every pixel number in whole image
    for (i = y; i < height; i++)
    {
        for (j = x; j < width; j++)
        {
            int current_pixel = image.at<uint8_t>(i, j);
            pixelCount[current_pixel]++;
        }
    }
    //count every pixel's radio in whole image pixel
    for (i = 0; i < 256; i++)
    {
        pixelPro[i] = pixelCount[i] / pixelSum;
    }
    // segmentation of the foreground and background
    // To traversal grayscale [0,255],and calculates the variance maximum
    //grayscale values for the best threshold value
    float w0, w1, u0tmp, u1tmp, u0, u1, u, deltaTmp, deltaMax = 0;
    for (i = 0; i < 256; i++)
    {
        w0 = w1 = u0tmp = u1tmp = u0 = u1 = u = deltaTmp = 0;
        for (j = 0; j < 256; j++)
        {
            if (j <= i)
            //background
            {
                w0 += pixelPro[j];
                u0tmp += j * pixelPro[j];
            }
            else
            {
                //foreground
                w1 += pixelPro[j];
                u1tmp += j * pixelPro[j];
            }
        }
        u0 = u0tmp / w0;
        u1 = u1tmp / w1;
        u = u0tmp + u1tmp;
        //Calculating the variance
        deltaTmp = w0 * (u0 - u) * (u0 - u) + w1 * (u1 - u) * (u1 - u);
        if (deltaTmp > deltaMax)
        {
            deltaMax = deltaTmp;
            threshold = i;
        }
    }

    //return the best threshold;
    return threshold;
}
cv::Mat CentroidTracker::getDiff(cv::Mat& gray_image)
{
    cv::Mat diff; // difference image within the gate
    // absolute difference: represents the magnitude of change between frames
    cv::absdiff(gray_image(_roi), this->gray_image(_roi), diff);
    return diff;
}
cv::Rect CentroidTracker::update(cv::Mat image)
{
    // convert to greyscale
    cv::Mat gray_image;
    cv::cvtColor(image, gray_image, CV_BGR2GRAY);

    // differencing within the gate
    cv::Mat diff = getDiff(gray_image);

    // thresholding within the gate
    int threshold = getThresholdValue(diff);
    cv::Mat seg_image, seg_image_erode, seg_image_dilate;
    segmentByThreshold(diff, seg_image, threshold);
    // optional denoising within the gate: erosion removes noise, dilation re-emphasizes the target
    // cv::Mat struct_ele = cv::getStructuringElement(cv::MORPH_RECT, cv::Size(3, 3), cv::Point(-1, -1)); 
    // cv::erode(seg_image, seg_image_erode, struct_ele);  
    // cv::dilate(seg_image, seg_image_dilate, struct_ele);
    // seg_image = seg_image_dilate;

    // find the centroid
    cv::Point_<int> centroid = findCentroid(seg_image);
    if (first_ite)
    {
        ccentroid = centroid;
        first_ite = false;
    }
    // adjust roi by the absolute center's location
    int cent_x = _roi.x + (centroid.x + ccentroid.x) / 2;
    int cent_y = _roi.y + (centroid.y + ccentroid.y) / 2;
    _roi.x = int(cent_x - _roi.width / 2);
    _roi.y = int(cent_y - _roi.height / 2);
    
    // Clamp the gate so it stays fully inside the image; otherwise the next
    // call to gray_image(_roi) would index outside the frame.
    _roi.x = _roi.x < 0 ? 0 : _roi.x;
    _roi.x = _roi.x + _roi.width > image.cols ? image.cols - _roi.width : _roi.x;
    _roi.y = _roi.y < 0 ? 0 : _roi.y;
    _roi.y = _roi.y + _roi.height > image.rows ? image.rows - _roi.height : _roi.y;
    ccentroid = centroid;
    this->image = image;
    this->gray_image = gray_image;
    
    ite_num++;

    if (this->DEBUG)
    {
        std::cout << "Centroid Location: " << centroid.x << ", " << centroid.y << std::endl;
        std::cout << "Centroid absolute location: " << cent_x << ", " << cent_y << std::endl;
        cv::imshow("DEBUG_DIFFIMAGE", diff);
        cv::imshow("DEBUG_SEGIMAGE", seg_image);
        // cv::imshow("DEBUG_SEGIMAGE_ERODE", seg_image_erode);
        cv::Mat cop_image;
        image.copyTo(cop_image);
        cv::circle(cop_image, cv::Point_<int>(cent_x, cent_y), 3, cv::Scalar_<int>(0,255,0));
        cv::rectangle(cop_image, cv::Point_<int>(_roi.x, _roi.y), cv::Point_<int>(_roi.x+ _roi.width, _roi.y+_roi.height), cv::Scalar_<int>(0,255,0));
        cv::imshow("Centroid && bbox", cop_image);
        cv::waitKey(0);
    }
    return _roi;
}
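To show how the tracker might be driven frame by frame, here is a hypothetical usage sketch (not part of the original article). The video path is a placeholder, and cv::selectROI is used only to pick the initial gate by hand; it requires OpenCV 3.2 or newer with the highgui module.

// main.cpp -- hypothetical driver for CentroidTracker
#include <opencv2/opencv.hpp>
#include "centroid_tracker.h"

int main()
{
    cv::VideoCapture cap("input.avi");      // placeholder video path
    cv::Mat frame;
    if (!cap.read(frame)) return -1;

    cv::Rect gate = cv::selectROI("init", frame);   // pick the initial gate by hand
    CentroidTracker tracker(/*debug=*/false);
    tracker.init(frame, gate);

    while (cap.read(frame))
    {
        cv::Rect roi = tracker.update(frame);       // gate re-centred on the centroid
        cv::rectangle(frame, roi, cv::Scalar(0, 255, 0));
        cv::imshow("tracking", frame);
        if (cv::waitKey(30) == 27) break;           // Esc to quit
    }
    return 0;
}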



Copyright notice: This is an original article by wuyanxue, licensed under CC 4.0 BY-SA. Please include a link to the original source and this notice when reposting.