
Kinect SDK C++ - 2. Kinect Depth Data

Today we will learn how to get depth data from the Kinect and what the format of that data is.


Kinect code


Kinect Initialization


To get depth data from the Kinect, simply change the arguments to NuiImageStreamOpen().


The first argument is now NUI_IMAGE_TYPE_DEPTH, telling the Kinect that we now want depth images instead of RGB images. (For clarity we also changed the name of the handle to reflect this.)


We should also enable Near Mode, which makes the Kinect more sensitive to closer objects (say, from 50 cm out to 200 cm); otherwise the usable range is roughly 80 cm to 400 cm.


To do that, pass the flag NUI_IMAGE_STREAM_FLAG_ENABLE_NEAR_MODE as the third argument.

    // NEW VARIABLE
    HANDLE depthStream;

    bool initKinect() {
        // Get a working kinect sensor
        int numSensors;
        if (NuiGetSensorCount(&numSensors) < 0 || numSensors < 1) return false;
        if (NuiCreateSensorByIndex(0, &sensor) < 0) return false;

        // Initialize sensor
        sensor->NuiInitialize(NUI_INITIALIZE_FLAG_USES_DEPTH | NUI_INITIALIZE_FLAG_USES_COLOR);

        // --------------- START CHANGED CODE -----------------
        sensor->NuiImageStreamOpen(
            NUI_IMAGE_TYPE_DEPTH,                      // Depth camera or rgb camera?
            NUI_IMAGE_RESOLUTION_640x480,              // Image resolution
            NUI_IMAGE_STREAM_FLAG_ENABLE_NEAR_MODE,    // Image stream flags, e.g. near mode
            2,                                         // Number of frames to buffer
            NULL,                                      // Event handle
            &depthStream);
        // --------------- END CHANGED CODE -----------------
        return sensor;
    }



For more information about Near Mode, please refer to the official Kinect for Windows blog.
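
As an aside, Near Mode can also be switched on or off after the stream is already open. The helper below is only a sketch, assuming the NuiImageStreamSetImageFrameFlags method that the Kinect SDK 1.x exposes on INuiSensor, together with the sensor pointer and depthStream handle from initKinect() above; it is not part of the original tutorial code.

    // Sketch: toggle Near Mode on an already-open depth stream.
    // Assumes `sensor` and `depthStream` from initKinect() above and the
    // INuiSensor::NuiImageStreamSetImageFrameFlags call from Kinect SDK 1.x.
    void setNearMode(bool enable) {
        DWORD flags = enable ? NUI_IMAGE_STREAM_FLAG_ENABLE_NEAR_MODE : 0;
        HRESULT hr = sensor->NuiImageStreamSetImageFrameFlags(depthStream, flags);
        if (FAILED(hr)) {
            // Near Mode is only supported by Kinect for Windows hardware;
            // the call fails on an Xbox 360 Kinect.
        }
    }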




Getting a depth frame from the Kinect


We will display the depth image from the Kinect in grayscale. Each pixel will simply be that pixel's distance from the Kinect (in millimeters) mod 256.


Note the NuiDepthPixelToDepth function: each raw pixel packs both the depth and the player index, and calling this function returns just the depth in millimeters at that pixel.


The depth data is 16 bits, so we use a USHORT to read it in.

    const USHORT* curr = (const USHORT*) LockedRect.pBits;
    const USHORT* dataEnd = curr + (width*height);

    while (curr < dataEnd) {
        // Get depth in millimeters
        USHORT depth = NuiDepthPixelToDepth(*curr++);

        // Draw a grayscale image of the depth:
        // B, G, R are all set to depth%256; alpha is set to 0xff (fully opaque).
        for (int i = 0; i < 3; ++i)
            *dest++ = (BYTE) depth % 256;
        *dest++ = 0xff;
    }
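
For context, the loop above sits inside a frame-grabbing function. The sketch below shows roughly where LockedRect and dest come from; the frame-acquisition calls (NuiImageStreamGetNextFrame, LockRect, NuiImageStreamReleaseFrame) follow the usual Kinect SDK 1.x pattern, and the getKinectData name, the dest parameter, and the width/height constants are assumptions for illustration rather than code quoted from this post.

    // Sketch of the surrounding frame-acquisition code (not verbatim from the post):
    // the USHORT loop above sits between LockRect and UnlockRect.
    const int width = 640;    // matches NUI_IMAGE_RESOLUTION_640x480
    const int height = 480;

    void getKinectData(BYTE* dest) {
        NUI_IMAGE_FRAME imageFrame;
        NUI_LOCKED_RECT LockedRect;

        // Grab the next depth frame; bail out if none is ready yet.
        if (sensor->NuiImageStreamGetNextFrame(depthStream, 0, &imageFrame) < 0) return;

        INuiFrameTexture* texture = imageFrame.pFrameTexture;
        texture->LockRect(0, &LockedRect, NULL, 0);

        if (LockedRect.Pitch != 0) {
            // ... the USHORT loop shown above goes here ...
        }

        texture->UnlockRect(0);
        sensor->NuiImageStreamReleaseFrame(depthStream, &imageFrame);
    }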
    


That's all the Kinect code! The rest is just how to get it to display.
