StackDoc


A Simple Example of Gesture-Based Control with Microsoft Kinect (WPF / C#)

Date: 2011-07-15 | Source: Internet | Author: Internet

At the E3 expo on June 2, 2009, Microsoft officially announced its motion-sensing peripheral for the Xbox 360. Natal did away with the single, controller-bound style of game play and brought the idea of natural human-computer interaction much more fully to life. It is a 3D motion-sensing camera (developed under the code name "Project Natal") that combines real-time motion capture, image recognition, microphone input, speech recognition, and social features. Natal needs no controller at all: its cameras capture the player's movements in three-dimensional space. Kinect has brought not only a new kind of game-control experience; the style of interaction it sparked, teaching machines to "read" people, is also driving a new round of change in human-computer interaction technology.

 

On June 16, 2011, Microsoft officially released the Kinect for Windows SDK beta, which lets developers build the same kind of interactive applications on Windows as on the Xbox.

 

Kinect for Windows SDK download:

http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/download.aspx

The Kinect SDK currently supports only Windows 7, in x86 and x64 editions. Development also requires .NET Framework 4.0 and Visual Studio 2010 (Express edition at minimum).

Kinect SDK video tutorials:

http://channel9.msdn.com/Series/KinectSDKQuickstarts?sort=recent#tab_sortBy_recent

Kinect SDK programming guides:

http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/guides.aspx

Official Kinect SDK forum:

http://social.msdn.microsoft.com/Forums/en-US/kinectsdk/threads

Hardware requirements: a Kinect for Xbox 360 sensor and the Xbox 360 Kinect AC Adapter / Power Supply.

 

This tutorial walks through a simple example of moving both hands on a 2D plane:

First, download and install the Kinect for Windows SDK.

 

Create a new WPF project in Visual Studio 2010.

To keep the code clear and readable, we first create a class library (KinectLib):

Start by adding the following references:

Microsoft.Research.Kinect.dll (C:\Program Files\Microsoft Research KinectSDK\Microsoft.Research.Kinect.dll)

PresentationFramework

WindowsBase

Rename the class to KinectSensor. The complete class library code is as follows:

 

using System;
using System.Windows;
using Microsoft.Research.Kinect.Nui;

namespace KinectLib
{
    // Delegate used to report the display coordinates of a moving joint.
    public delegate void receivePoints(int jointID, double x, double y);

    public class KinectSensor : IDisposable
    {
        public event receivePoints _receivepoints;
        Runtime nui;

        public KinectSensor()
        {
            nui = new Runtime();

            try
            {
                nui.Initialize(RuntimeOptions.UseDepthAndPlayerIndex | RuntimeOptions.UseSkeletalTracking | RuntimeOptions.UseColor);
            }
            catch (InvalidOperationException)
            {
                MessageBox.Show("No Kinect camera detected!");
                return;
            }
            nui.SkeletonFrameReady += new EventHandler<SkeletonFrameReadyEventArgs>(nui_SkeletonFrameReady);
        }

        void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
        {
            SkeletonFrame skeletonFrame = e.SkeletonFrame;
            foreach (SkeletonData data in skeletonFrame.Skeletons)
            {
                if (SkeletonTrackingState.Tracked == data.TrackingState)
                {
                    foreach (Joint joint in data.Joints)
                    {
                        Point jointPos = getDisplayPosition(joint);
                        if (_receivepoints != null)  // guard against firing with no subscribers
                        {
                            _receivepoints((int)joint.ID, jointPos.X, jointPos.Y);
                        }
                    }
                }
            }
        }

        private Point getDisplayPosition(Joint joint)
        {
            float depthX, depthY;
            nui.SkeletonEngine.SkeletonToDepthImage(joint.Position, out depthX, out depthY);
            depthX = Math.Max(0, Math.Min(depthX * 320, 320));  // convert to 320x240 depth space
            depthY = Math.Max(0, Math.Min(depthY * 240, 240));

            int colorX, colorY;
            ImageViewArea iv = new ImageViewArea();
            // Only ImageResolution.Resolution640x480 is supported at this point.
            nui.NuiCamera.GetColorPixelCoordinatesFromDepthPixel(ImageResolution.Resolution640x480, iv, (int)depthX, (int)depthY, (short)0, out colorX, out colorY);

            // Map back down to a 116 x 87 coordinate space.
            return new Point((int)(116 * colorX / 640.0), (int)(87 * colorY / 480.0));
        }

        public void Dispose()
        {
            nui.Uninitialize();  // release the sensor
        }
    }
}
   

The Kinect carries three lenses on its front; one of them is the depth camera, and reading and analyzing the depth data (from which the skeleton stream above is computed) is enough to capture these hand gestures.

With this class library in place, calling it from the main window is straightforward:

using KinectLib;

public partial class MainWindow : Window
{
    KinectSensor ks;

    public MainWindow()
    {
        InitializeComponent();
        Loaded += new RoutedEventHandler(MainWindow_Loaded);
    }

    void MainWindow_Loaded(object sender, RoutedEventArgs e)
    {
        ks = new KinectSensor();
        ks._receivepoints += new receivePoints(ks__receivepoints);
    }

    void ks__receivepoints(int jointID, double x, double y)
    {
        // In the beta SDK's JointID enum:
        //    HandLeft = 7   (left hand)
        //    HandRight = 11 (right hand)
        double x1 = x * 8.8;  // scale up 8.8x: 116 * 8.8 ≈ 1024
        double y1 = y * 8.8;  // scale up 8.8x:  87 * 8.8 ≈ 768

        switch (jointID)
        {
            case 7:   // left hand
                Canvas.SetLeft(this.leftHand, x1);
                Canvas.SetTop(this.leftHand, y1);
                break;
            case 11:  // right hand
                Canvas.SetLeft(this.rightHand, x1);
                Canvas.SetTop(this.rightHand, y1);
                break;
        }
    }

    // Wired up via Unloaded="MainWin_Unloaded" in the window's XAML.
    private void MainWin_Unloaded(object sender, RoutedEventArgs e)
    {
        if (ks is IDisposable)  // release the Kinect
        {
            ks.Dispose();
        }
    }
}
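The handler above moves two elements named leftHand and rightHand around a Canvas, but the article never shows the markup. A minimal MainWindow.xaml consistent with the code might look like the sketch below; only the element names and the Unloaded handler are implied by the code, while the class namespace, window size, shapes, and colors are assumptions:

```xml
<Window x:Class="KinectDemo.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="Kinect Hands" Width="1024" Height="768"
        Unloaded="MainWin_Unloaded">
    <Canvas>
        <!-- The two markers that ks__receivepoints positions. -->
        <Ellipse x:Name="leftHand"  Width="40" Height="40" Fill="Red" />
        <Ellipse x:Name="rightHand" Width="40" Height="40" Fill="Blue" />
    </Canvas>
</Window>
```

Because the scaled coordinates top out around 1024 x 768, the window is sized to match so the markers stay inside the Canvas.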

 

 

 

Download the complete code for this tutorial

 

 


From:http://blog.csdn.net/soft2buy/article/details/6589326