{"id":499,"date":"2021-02-27T15:23:15","date_gmt":"2021-02-27T06:23:15","guid":{"rendered":"http:\/\/www.sawada.phys.waseda.ac.jp\/?page_id=499"},"modified":"2021-02-27T17:15:39","modified_gmt":"2021-02-27T08:15:39","slug":"499-2","status":"publish","type":"page","link":"https:\/\/www.sawada.phys.waseda.ac.jp\/?page_id=499","title":{"rendered":"2016\u5e74\u5ea6\u4ee5\u524d\u306e\u7814\u7a76\u696d\u7e3e"},"content":{"rendered":"<p class=\"style3\"><span class=\"style4\"><strong>2016\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span>\u3000 [1] Junichi Danjo, Sonoko Danjo, Yu Nakamura, Keiji Uchida, and Hideyuki Sawada: \u201cMicro-Vibration Patterns Generated from Shape Memory Alloy Actuators and the Detection of an Asymptomatic Tactile Sensation Decrease in Diabetic Patients\u201d,<br \/>\nIEICE TRANSACTIONS on Information and Systems, \u96fb\u5b50\u60c5\u5831\u901a\u4fe1\u5b66\u4f1a, Vol. E99-D, No.11, pp.2759-2766, November 2016<br \/>\n[2] Vo Nhu Thanh and Hideyuki Sawada: \u201cA Talking Robot and Its Real-time Interactive Modification for Speech Clarification\u201d,<br \/>\nSICE Journal of Control, Measurement, and System Integration, \u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a, pp. 251-256, Vol.9, No.6, November 2016<br \/>\n[3] Takehiro Miki, Toshinori Iwai, Kazunori Kotani, Jianwu Dang, Hideyuki Sawada and Minoru Miyake: \u201cDevelopment of a virtual reality training system for endoscope-assisted submandibular gland removal\u201d,<br \/>\nJournal of Cranio-Maxillofacial Surgery, Volume 44, Issue 11, pp. 1800-1805,\u00a0<a href=\"http:\/\/dx.doi.org\/10.1016\/j.jcms.2016.08.018\">doi<\/a>, November 2016<br \/>\n[4] Vo Nhu Thanh and Hideyuki Sawada: \u201cAutomatic Vowel Sequence Reproduction for a Talking Robot Based on PARCOR Coefficient Template Matching\u201d,<br \/>\nIEIE Transactions on Smart Processing and Computing, Vol. 5, No. 
3, pp.215-221, 2016<br \/>\n[5] Hideyuki Sawada, Keiji Uchida, Junichi Danjo and Yu Nakamura: \u201cDevelopment of a Non-invasive Screening Device of Diabetic Peripheral Neuropathy Based on the Perception of Micro-vibration\u201d,<br \/>\nIEEE International Conference on Computational Intelligence in Bioinformatics and Computational Biology 2016 (CIBCB2016), Chiang Mai, Thailand, pp. 1-6, 2016<br \/>\n[6] Vo Nhu Thanh and Hideyuki Sawada: \u201cComparison of Several Acoustic Features for the Vowel Sequence Reproduction of a Talking Robot\u201d,<br \/>\n2016 IEEE International Conference on Mechatronics and Automation (ICMA2016), Harbin, China, pp. 1137-1142, 2016<br \/>\n[7] Hideyuki Sawada, Shohei Kitano and Sho Yokota: \u201cA Hapto-tactile Display for Presenting Virtual Objects in Human-scale Tactual Search\u201d,<br \/>\nIEEE International Conference on Human System Interaction (HSI2016), Portsmouth, UK, pp. 372-377, 2016<br \/>\n[8]\u00a0<a href=\"http:\/\/www.springer.com\/la\/book\/9784431557715\">Pervasive Haptics<\/a>, Springer Japan, July 2016, ISBN 978-4-431-55772-2, \u201cTactile Display Using the Micro-vibration of Shape-Memory Alloy Wires and Its Application to Tactile Interaction Systems\u201d \u3092\u57f7\u7b46, pp. 57-77, 2016<br \/>\n[9] \u5185\u7530\u5553\u6cbb, \u6fa4\u7530\u79c0\u4e4b: \u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u30ef\u30a4\u30e4\u306e\u5fae\u632f\u52d5\u306b\u3088\u308b\u89e6\u899a\u611f\u899a\u306e\u5448\u793a\u3068\u6307\u5148\u611f\u899a\u30c1\u30a7\u30c3\u30ab\u300d,<br \/>\n\u6750\u6599\u8a66\u9a13\u6280\u8853, Vol. 61, No. 
3, pp.195-196, 2016\u5e749\u6708<br \/>\n[10]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u30ef\u30a4\u30e4\u306e\u5fae\u5c0f\u632f\u52d5\u3092\u7528\u3044\u305f\u89e6\u611f\u5448\u793a\u3068\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u3078\u306e\u5fdc\u7528\u300d,<br \/>\n<a href=\"http:\/\/www.nedia.or.jp\/ddf2016\/\">\u7b2c3\u56de \u96fb\u5b50\u30c7\u30d0\u30a4\u30b9\u30d5\u30a9\u30fc\u30e9\u30e0\u4eac\u90fd<\/a>, \u4eac\u90fd\u30ea\u30b5\u30fc\u30c1\u30d1\u30fc\u30af, 2016\u5e7411\u67082\u65e5(\u6c34)<br \/>\n[11]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5fae\u5c0f\u632f\u52d5\u5b50\u30a2\u30ec\u30a4\u3092\u7528\u3044\u305f\u89e6\u611f\u63d0\u793a\u6280\u8853\u3068\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u306e\u69cb\u6210\u300d\u3001<br \/>\n<a href=\"http:\/\/www.gijutu.co.jp\/\">\u6280\u8853\u60c5\u5831\u5354\u4f1a<\/a>\u00a0<a href=\"http:\/\/www.gijutu.co.jp\/doc\/s_609433.htm\">\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u306b\u5411\u3051\u305f\u932f\u89e6\u30e1\u30ab\u30cb\u30ba\u30e0\u3068\u89e6\u899a\u63d0\u793a\u6280\u8853<\/a>, 2016\u5e749\u670828\u65e5(\u6c34)<br \/>\n[12] Vo Nhu Thanh and Hideyuki Sawada: \u201cVietnamese Language Speech Performance by the Talking Robot\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2016\u5e749\u670817\u65e5(\u571f), \u5fb3\u5cf6\u5927\u5b66, p.82, 2016<\/p>\n<p class=\"style3\"><span class=\"style4\"><strong>2015\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span>\u3000 [1]\u00a0<span class=\"style8\">Best Paper Award<\/span>, Hideyuki Sawada and Yuto Takeda: \u201cTactile Pen for Presenting Texture Sensation from Touch Screen\u201d,<br \/>\nIEEE International Conference on Human System Interaction, pp. 
334-339, 2015<br \/>\n[2] Hideyuki Sawada: \u201cA talking Robot and Its Autonomous Learning of Speech Articulation for Producing Expressive Speech\u201d,<br \/>\nin\u00a0<i>Emergent Trends in Robotics and Intelligent Systems<\/i>, Edited by Peter Sincak and Pitoyo Hartono, pp.93-102,<br \/>\nAdvances in Intelligent Systems and Computing, Volume 316, 2015, ISSN 2194-5357, ISBN 978-3-319-10782-0<br \/>\n[3] Hideyuki Sawada and Brice Renaudeau: \u201cA Communication System with a Mobile Robot by Presenting Tactile Apparent Movement Sensation\u201d,<br \/>\n2016 IEEE International Conference on Industrial Technology (ICIT2016), Taipei, Taiwan, pp. 1862-1865, 2016<br \/>\n[4] Vo Nhu Thanh and Hideyuki Sawada: \u201cAutonomous Vowels Sequence Reproduction of a Talking Robot Using PARCOR Coefficients\u201d,<br \/>\nCD-ROM Proceedings of International Conference on Electronics, Information and Communication (ICEIC2016), Danang, Vietnam, TM1-4, 2016<br \/>\n[5] Hideyuki Sawada: \u201cA Talking Robot and the Autonomous Learning of Speech Articulation for the Communication with Humans\u201d,<br \/>\nInternational Workshop on Speech Robotics, Dresden, Germany, 2015<br \/>\n[6]\u00a0<span class=\"style10\">\u5bc4\u7a3f<\/span>, \u6fa4\u7530\u79c0\u4e4b: \u201c\u89e6\u899a\u611f\u899a\u3092\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u3059\u308b \uff5e\u30c6\u30ec\u30d3\u3084\u96fb\u8a71\u306e\u5411\u3053\u3046\u5074\u3092\u300c\u89e6\u308b\u300d\u6280\u8853\uff5e\u201d,<br \/>\n\u767e\u5341\u56db\u7d4c\u6e08\u7814\u7a76\u6240 \u8abf\u67fb\u6708\u5831, No. 347, pp. 
2-9, 2016<br \/>\n[7]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300cSMA\u30a2\u30af\u30c1\u30e5\u30a8\u30fc\u30bf\u306b\u3088\u308b\u89e6\u611f\u5448\u793a\u3068\u30bf\u30c3\u30c1\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u3078\u306e\u5fdc\u7528\u300d,<br \/>\n\u65e5\u672c\u6a5f\u68b0\u5b66\u4f1a 2015\u5e74\u5ea6\u5e74\u6b21\u5927\u4f1a \u5148\u7aef\u6280\u8853\u30d5\u30a9\u30fc\u30e9\u30e0\u300c\u4eba\u3068\u95a2\u308f\u308b\u30a2\u30af\u30c1\u30e5\u30a8\u30fc\u30bf\uff0e\u305d\u306e\u73fe\u72b6\u3068\u8ab2\u984c\u300d, \u5317\u6d77\u9053\u5927\u5b66, 2015\u5e749\u670816\u65e5(\u6c34)<br \/>\n[8]\u00a0<span class=\"style10\">\u8b1b\u7fd2\u4f1a\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u30ef\u30a4\u30e4\u306e\u5fae\u5c0f\u632f\u52d5\u30d1\u30bf\u30fc\u30f3\u3092\u5229\u7528\u3057\u305f\u89e6\u899a\u611f\u899a\u306e\u5448\u793a\u300d,<br \/>\n\u65e5\u672cVR\u5b66\u4f1a \u529b\u89e6\u899a\u306e\u63d0\u793a\u3068\u8a08\u7b97\u7814\u7a76\u4f1a \u89e6\u899a\u8b1b\u7fd2\u4f1a\u300c\u89e6\u899a\u6280\u8853\u306e\u57fa\u790e\u3068\u5fdc\u7528 \uff5e\u30d2\u30c8\u306e\u89e6\u899a\u7406\u89e3\u304b\u3089\u30d2\u30e5\u30fc\u30de\u30f3\u30de\u30b7\u30f3\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u3084\u30ed\u30dc\u30c3\u30c8\u3078\u306e\u5fdc\u7528\u307e\u3067\uff5e\u300d, \u7acb\u547d\u9928\u5927\u5b66 \u3073\u308f\u3053\u30fb\u304f\u3055\u3064\u30ad\u30e3\u30f3\u30d1\u30b9 \u30a8\u30dd\u30c3\u30af\u7acb\u547d21, 2015\u5e7411\u670827\u65e5(\u91d1)<br \/>\n[9]\u00a0<span class=\"style10\">Invited Lecture<\/span>, Hideyuki Sawada: \u201cHuman-machine interface as communication means for intelligent robots\u201d,<br \/>\n\u570b\u7acb\u4e2d\u6b63\u5927\u5b78 \u8a8d\u77e5\u79d1\u5b78\u4e2d\u5fc3, \u53f0\u6e7e, Wednesday 16th March, 2016<br \/>\n[10]\u00a0<span class=\"style10\">Invited Lecture<\/span>, Hideyuki Sawada: \u201cHuman-machine interface as communication means for intelligent 
robots\u201d,<br \/>\n\u570b\u7acb\u5609\u7fa9\u5927\u5b78 \u96fb\u6a5f\u884c\u7a0b\u5b78\u7cfb, \u53f0\u6e7e, Thursday 17th March, 2016<br \/>\n[11] \u6fa4\u7530\u79c0\u4e4b:\u300c\u89e6\u899a\u611f\u899a\u306e\u5448\u793a\u3068\u30bf\u30c3\u30c1\u30fb\u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f3\u3078\u306e\u5c55\u958b\u300d,<br \/>\n\u611f\u5bdf\u5de5\u5b66\u7814\u7a76\u4f1a, \u9999\u5ddd, 2015\u5e749\u67084\u65e5(\u91d1)<br \/>\n[12] Vo Nhu Thanh and Hideyuki Sawada: \u201cA Real Time Visualization System for Articulatory Analysis of the Talking Robot\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2015\u5e749\u670826\u65e5(\u571f), p. 86, 2015<br \/>\n[13] Yuan Lu and Hideyuki Sawada: \u201cWristband Camera and the Dynamic Hand Movement Analysis\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2015\u5e749\u670826\u65e5(\u571f), p. 204, 2015<br \/>\n[14] Brice Renaudeau and Hideyuki Sawada: \u201cA Communication System with a Mobile Robot by Presenting Tactile Apparent Movement Sensation\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2015\u5e749\u670826\u65e5(\u571f), p. 
288, 2015<\/p>\n<p class=\"style3\"><span class=\"style4\"><strong>2014\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span><span class=\"style6\">\u3000 [1] Hideyuki Sawada: \u201cA Talking Robot and the Expressive Speech Communication with Human\u201d,<br \/>\nInternational Journal of Affective Engineering, Vol.14, No.2, pp.95-102, 2015<br \/>\n[2]\u00a0<span class=\"style8\">Student Paper Award<\/span>, Yoshiaki Iwatani and Hideyuki Sawada: \u201cTactile Glove Using SMA Wires for Presenting Pseudo-tactile Sensations in VR Space\u201d,<br \/>\n2015 RISP International Workshop on Nonlinear Circuits, Communications and Signal Processing, pp. 166-169, 2015<br \/>\n[3] Hideyuki Sawada, Keiji Uchida, Junichi Danjo and Yu Nakamura: \u201cA Screening Device of Diabetic Peripheral Neuropathy Based on the Perception of Micro-vibration Patterns\u201d,<br \/>\n2015 RISP International Workshop on Nonlinear Circuits, Communications and Signal Processing, pp. 162-165, 2015<br \/>\n[4] Hideyuki Sawada and Guangyi Zhu: \u201cPresenting Tactile Stroking Sensations from a Touch Screen Using the Cutaneous Rabbit Illusion\u201d,<br \/>\nMecatronics2014, pp. 371-376, 2014<br \/>\n[5] Hideyuki Sawada and Potsawat Boonjaipetch: \u201cTactile Pad for the Presentation of Tactile Sensation from Moving Pictures\u201d,<br \/>\nInternational Conference on Human System Interaction, pp. 135-140, 2014<br \/>\n[6] Shohei Kitano and Hideyuki Sawada: \u201cDevelopment of a Haptic Device for Tactual Search in Virtual Environment\u201d,<br \/>\n5th Chiang Mai University \u2013 Kagawa University Joint Symposium, pp. 38-39, 2014<br \/>\n[7] Hiroki Taomoto and Hideyuki Sawada: \u201cInteraction with Autonomous Mobile Robot by Body Gesture\u201d,<br \/>\n5th Chiang Mai University \u2013 Kagawa University Joint Symposium, pp. 
110-111, 2014<br \/>\n[8]\u00a0<span class=\"style10\">\u7814\u7a76\u5c55\u793a<\/span>, \u300c\u30b8\u30a7\u30b9\u30c1\u30e3\u30fb\u30bf\u30c3\u30c1\u306b\u3088\u308b\u30ed\u30dc\u30c3\u30c8\u3068\u306e\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u300d\u3001<br \/>\n<a href=\"http:\/\/discovery-labo.jp\/\">\u30c7\u30a3\u30b9\u30ab\u30d0\u30ea\u30fc\u30e9\u30dc ISHIKAWA 2014<\/a>\u3001\u77f3\u5ddd\u770c\u7523\u696d\u5c55\u793a\u9928 1\u53f7\u9928\u30012014\u5e7411\u67088\u65e5(\u571f)\uff5e9\u65e5(\u65e5)<br \/>\n[9]\u00a0<span class=\"style10\">\u8b1b\u7fd2\u4f1a\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300cSMA\u30a2\u30af\u30c1\u30e5\u30a8\u30fc\u30bf\u3092\u7528\u3044\u305f\u89e6\u899a\u306e\u5e7b\u899a\u751f\u8d77\u3068\u89e6\u899a\u611f\u899a\u306e\u5448\u793a\u300d,<br \/>\nAsiaHaptics2014 \u89e6\u899a\u8b1b\u7fd2\u4f1a\u300c\u89e6\u899a\u6280\u8853\u306e\u57fa\u790e\u3068\u5fdc\u7528 \uff5e\u30d2\u30c8\u306e\u89e6\u899a\u7406\u89e3\u304b\u3089\u30d2\u30e5\u30fc\u30de\u30f3\u30de\u30b7\u30f3\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u3084\u30ed\u30dc\u30c3\u30c8\u3078\u306e\u5fdc\u7528\u307e\u3067\uff5e\u300d, \u30a8\u30dd\u30ab\u30eb\u3064\u304f\u3070, 2014\u5e7411\u670818\u65e5(\u706b)<br \/>\n[10]\u00a0<span class=\"style8\">Best English Presentation Award<\/span>, Yuan Sui and Hideyuki Sawada: \u201cTouching and feeling a virtual object with TactileGlove\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2014\u5e749\u670813\u65e5(\u571f), 2014<br \/>\n[11] Mao Matsumura and Hideyuki Sawada: \u201cMarker-based Gestural Control of a Drone\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2014\u5e749\u670813\u65e5(\u571f), 2014<br \/>\n[12] Yoshiaki Iwatani and Hideyuki Sawada: \u201cAn Experimental System for Presenting Tactile Penetrating Sensation\u201d,<br 
\/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2014\u5e749\u670813\u65e5(\u571f), 2014<br \/>\n[13] \u6fa4\u4e95\u53f2, \u4e09\u6728\u6b66\u5bdb, \u5ca9\u5d0e\u662d\u61b2, \u5927\u6797\u7531\u7f8e\u5b50, \u4e09\u5b85\u5b9f, \u5c0f\u8c37\u4e00\u5b54, \u515a\u5065\u6b66, \u6fa4\u7530\u79c0\u4e4b, \u78ef\u90e8\u6b63\u5229, \u677e\u4e95\u7fa9\u90ce:\u300c\u30d0\u30fc\u30c1\u30e3\u30eb\u30ea\u30a2\u30ea\u30c6\u30a3\u30b7\u30b9\u30c6\u30e0\u3092\u7528\u3044\u305f\u820c\u764c\u90e8\u5206\u5207\u9664\u8a13\u7df4\u30b7\u30b9\u30c6\u30e0\u306e\u958b\u767a\u300d,<br \/>\n\u7b2c59\u56de\u65e5\u672c\u53e3\u8154\u5916\u79d1\u5b66\u4f1a\u7dcf\u4f1a\u30fb\u5b66\u8853\u96c6\u4f1a, 2014.10<br \/>\n[14] \u6fa4\u4e95\u53f2, \u4e09\u6728\u6b66\u5bdb, \u5ca9\u5d0e\u662d\u61b2, \u5927\u6797\u7531\u7f8e\u5b50, \u4e09\u5b85\u5b9f, \u5c0f\u8c37\u4e00\u5b54, \u515a\u5065\u6b66, \u6fa4\u7530\u79c0\u4e4b, \u78ef\u90e8\u6b63\u5229, \u677e\u4e95\u7fa9\u90ce:\u300c\u30d0\u30fc\u30c1\u30e3\u30eb\u30ea\u30a2\u30ea\u30c6\u30a3\u3092\u7528\u3044\u305f\u820c\u764c\u90e8\u5206\u5207\u9664\u8a13\u7df4\u30b7\u30b9\u30c6\u30e0\u306e\u958b\u767a\u300d,<br \/>\n\u7b2c68\u56deNPO\u6cd5\u4eba \u65e5\u672c\u53e3\u8154\u79d1\u5b66\u4f1a\u5b66\u8853\u96c6\u4f1a, 2014.05<br \/>\n<\/span><\/p>\n<p class=\"style3\"><span class=\"style4\"><strong>2013\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span><span class=\"style6\">\u3000 [1] Changan Jiang, Keiji Uchida and Hideyuki Sawada, \u201cResearch and Development of Vision Based Tactile Display System Using Shape Memory Alloys\u201d,<br \/>\nInternational Journal of Innovative Computing, Information and Control, Vol.10, No.3, pp. 837-850, 2014<br \/>\n[2] Yuto Takeda and Hideyuki Sawada, \u201cTactile Actuators Using SMA Micro-wires and the Generation of Texture Sensation from Images\u201d,<br \/>\nIEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS2013), pp. 
2017-2022, 2013<br \/>\n[3] Hideyuki Sawada and Chika Udaka: \u201cA Robotic Auditory System for Imitating Human Listening Behavior\u201d,<br \/>\nIEEE International Conference on Mechatronics and Automation, pp. 773-778, 2013<br \/>\n[4] Hideyuki Sawada, Yu Nakamura, Yuto Takeda and Keiji Uchida: \u201cMicro-vibration Array Using SMA actuators for the Screening of Diabetes\u201d,<br \/>\nInternational Conference on Human System Interaction, pp. 620-625, 2013<br \/>\n[5] Mitsuki Kitani and Hideyuki Sawada: \u201cMechanical Reproduction of Human-like Expressive Speech Using a Talking Robot\u201d,<br \/>\nInternational Conference on Biometrics and Kansei Engineering, pp.229-234, 2013<br \/>\n[6]\u00a0<span class=\"style10\">Plenary Talk<\/span>, Hideyuki Sawada: \u201cA talking Robot and Its Autonomous Learning of Speech Articulation for Producing Expressive Speech\u201d,<br \/>\nSYMPOSIUM ON EMERGENT TRENDS IN ARTIFICIAL INTELLIGENCE &amp; ROBOTICS, September 15-17, 2013, Kosice, SLOVAKIA, 2013<br \/>\n[7]\u00a0<span class=\"style10\">Exhibitions and Demonstrations<\/span>, \u201cTALKING ROBOT\u201d,<br \/>\n<a href=\"http:\/\/ktj.in\/\">Kshitij 2014<\/a>\u00a0\u2013 Asia\u2019s Largest Techno-Management Fest,\u00a0<a href=\"http:\/\/www.iitkgp.ac.in\/\">Indian Institute of Technology Kharagpur<\/a>, Kharagpur, INDIA, January 31st \u2013 February 3rd, 2014<br \/>\n[8]\u00a0<span class=\"style10\">\u7814\u7a76\u5c55\u793a<\/span>, \u300c\u30b8\u30a7\u30b9\u30c1\u30e3\u30fb\u89e6\u899a\u3092\u4f7f\u3063\u305f\u30ed\u30dc\u30c3\u30c8\u3068\u306e\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u300d\u3001<br \/>\n<a href=\"http:\/\/www.yumemirai.jp\/\">\u3044\u3057\u304b\u308f\u201d\u5922\u201d\u672a\u6765\u535a<\/a>\u3001\u77f3\u5ddd\u770c\u7523\u696d\u5c55\u793a\u9928 1\u53f7\u9928\u30012013\u5e7411\u67089\u65e5(\u571f)\uff5e10\u65e5(\u65e5)<br \/>\n[9]\u00a0<a 
href=\"http:\/\/www.stbook.co.jp\/products\/detail.php?product_id=263\">\u89e6\u899a\u8a8d\u8b58\u30e1\u30ab\u30cb\u30ba\u30e0\u3068\u5fdc\u7528\u6280\u8853-\u89e6\u899a\u30bb\u30f3\u30b5\u30fb\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4-\u3010\u5897\u88dc\u7248\u3011<\/a>, \u30b5\u30a4\u30a8\u30f3\u30b9&amp;\u30c6\u30af\u30ce\u30ed\u30b8\u30fc, 2014\u5e743\u670819\u65e5 \u767a\u884c, ISBN 978-4-907002-37-4, \u300c\u5fae\u5c0f\u632f\u52d5\u5b50\u30a2\u30ec\u30a4\u3092\u7528\u3044\u305f\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u3068\u89e6\u611f\u899a\u306e\u63d0\u793a\u300d\u3092\u57f7\u7b46, pp. 497-509, 2014<br \/>\n[10] \u5ca9\u8c37\u4eae\u660e, \u6fa4\u7530\u79c0\u4e4b:\u300cVR\u30a8\u30f3\u30bf\u30c6\u30a4\u30e1\u30f3\u30c8\u306b\u5411\u3051\u305f\u30a8\u30a2\u697d\u5668\u6f14\u594f\u30b7\u30b9\u30c6\u30e0\u300d,<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a \u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f32014 \u8ad6\u6587\u96c6, pp. 587-592, 2013<br \/>\n[11] \u6731 \u5e83\u6bc5, \u6fa4\u7530\u79c0\u4e4b:\u300c\u632f\u52d5\u30d1\u30bf\u30fc\u30f3\u3068\u8996\u899a\u60c5\u5831\u306e\u540c\u6642\u523a\u6fc0\u306b\u3088\u308bCutaneous Rabbit Illusion\u52b9\u679c\u306e\u5411\u4e0a\u300d,<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c14\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2013)\u30012013\u5e7412\u670818\u65e5\uff5e20\u65e5, pp. 
1575-1578, 2013<br \/>\n[12] Yoshiaki Iwatani and Hideyuki Sawada: \u201cImmersive Air Guitar System Presenting Tactile Sensations\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2013\u5e749\u670821\u65e5(\u571f), p.303, 2013<br \/>\n[13] Guangyi Zhu and Hideyuki Sawada: \u201cPresenting the \u201cCutaneous Rabbit\u201d Illusion with Visual Stimulation\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2013\u5e749\u670821\u65e5(\u571f), p.304, 2013<br \/>\n<\/span><\/p>\n<p class=\"style3\"><span class=\"style4\"><strong>2012\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span><span class=\"style6\">\u3000 [1] Feng ZHAO, Changan JIANG and Hideyuki SAWADA: \u201cA Novel Braille Display Using the Vibration of SMA Wires and the Evaluation of Braille Presentations\u201d,<br \/>\nJournal of Biomechanical Science and Engineering, Vol.7, No.4, pp.416-432, 2012<br \/>\n[2] \u5e73\u4e95\u512a\u53f8, \u9577\u4e95\u7f8e\u548c, \u6a2a\u4e95\u82f1\u4eba, \u6fa4\u7530\u79c0\u4e4b:\u300c\u30aa\u30f3\u30c8\u30ed\u30b8\u30fc\u306b\u5bfe\u5fdc\u3057\u305f\u96fb\u5b50\u30ab\u30eb\u30c6\u5165\u529b\u652f\u63f4\u30b7\u30b9\u30c6\u30e0\u3068\u533b\u7642\u73fe\u5834\u3067\u306e\u97f3\u58f0\u5165\u529b\u5b9f\u9a13\u300d,<br \/>\n\u533b\u7642\u60c5\u5831\u5b66, \u7b2c32\u5dfb, \u7b2c2\u53f7, pp.73-81, 2012<br \/>\n[3] Shinji Okumoto, Feng Zhao and Hideyuki Sawada: \u201cTactoGlove Presenting Tactile Sensations for Intuitive Gestural Interaction\u201d,<br \/>\n21st IEEE International Symposium on Industrial Electronics, pp.1680-1685, 2012<br \/>\n[4] Hideyuki Sawada, Feng Zhao and Keiji Uchida: \u201cDisplaying Braille for Mobile Use with the Micro-vibration of SMA Wires\u201d,<br \/>\nInternational Conference on Human System Interaction, CD-ROM Proceedings, 2012<br \/>\n[5] Mitsuki Kitani and Hideyuki Sawada: 
\u201cReproduction of plosive sound vocalization by the talking robot based on the visual information\u201d,<br \/>\nInternational Conference on Disability, Virtual Reality and Associated Technologies, pp.459-462, 2012<br \/>\n[6]\u00a0<span class=\"style10\">\u62db\u5f85\u8b1b\u6f14<\/span>, Hideyuki Sawada: \u201cA talking and singing robot: Mechanical construction of a human vocal system for the face-to-face vocal communication\u201d,<br \/>\nROBOTIC EXPERIENCE in Italy and Japan, JAPAN DAYS 2012, Saturday 12th May, 2012<br \/>\n[7]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b:\u300c\u89e6\u899a\u5448\u793a\u6280\u8853\u304b\u3089\u898b\u305f\u6b21\u4e16\u4ee3\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u3068\u4eba\u9593\u652f\u63f4\u300d\u3001<br \/>\n\u65b0\u5316\u5b66\u6280\u8853\u63a8\u9032\u5354\u4f1a\u30012012\u5e745\u670824\u65e5(\u6728)<br \/>\n[8]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u3092\u5229\u7528\u3057\u305f\u89e6\u899a\u5448\u793a\u3068\u6307\u5148\u611f\u899a\u691c\u67fb\u88c5\u7f6e\u300d\u3001<br \/>\n\u30a4\u30ce\u30d9\u30fc\u30b7\u30e7\u30f3\u30fb\u30b8\u30e3\u30d1\u30f32020-\u5927\u5b66\u898b\u672c\u5e02 JST\u30b7\u30e7\u30fc\u30c8\u30d7\u30ec\u30bc\u30f3\u30c6\u30fc\u30b7\u30e7\u30f3\u3001\u6771\u4eac\u56fd\u969b\u30d5\u30a9\u30fc\u30e9\u30e0\u30012012\u5e749\u670828\u65e5(\u91d1)<br \/>\n[9]\u00a0<span class=\"style10\">\u7814\u7a76\u5c55\u793a<\/span>, \u300c\u4eba\u306e\u6a5f\u80fd\u3092\u30ed\u30dc\u30c3\u30c8\u6280\u8853\u3067\u518d\u73fe\u3059\u308b\u300d\u3001<br \/>\n<a href=\"http:\/\/www.yumemirai.jp\/\">\u3044\u3057\u304b\u308f\u201d\u5922\u201d\u672a\u6765\u535a<\/a>\u3001\u77f3\u5ddd\u770c\u7523\u696d\u5c55\u793a\u9928 1\u53f7\u9928\u30012012\u5e7411\u670810\u65e5(\u571f)\uff5e11\u65e5(\u65e5)<br \/>\n[10] Nattapong Swangmuang, Kasemsak Uthaichana, Hideyuki Sawada, Nipon Theera-Umpon: \u201cDevelopment of 
Translucent Mangosteen Classification by Acoustic-based Sensing\u201d,<br \/>\n<a href=\"http:\/\/www.kagawa-u.ac.jp\/ku-cmu-sympo\/\">4th Kagawa University \u2013 Chiang Mai University Joint Symposium 2012 -Healthy Aging and Sustainable Society-<\/a>, p.39, 2012<br \/>\n[11] Mitsuki Kitani, Nattapong Swangmuang, Kasemsak Uthaichana, Nipon Theera-Umpon and Hideyuki Sawada: \u201cTowards the Non-destructive Inspection of Translucent Mangosteens: First Trial by Employing Acoustic Signal Response and SOM Classification\u201d,<br \/>\n<a href=\"http:\/\/www.kagawa-u.ac.jp\/ku-cmu-sympo\/\">4th Kagawa University \u2013 Chiang Mai University Joint Symposium 2012 -Healthy Aging and Sustainable Society-<\/a>, p.40, 2012<br \/>\n[12] \u5ca9\u8c37\u4eae\u660e, \u6fa4\u7530\u79c0\u4e4b:\u300c\u89e6\u899a\u5448\u793a\u3092\u5099\u3048\u305f\u6ca1\u5165\u578b\u30a8\u30a2\u30ae\u30bf\u30fc\u30b7\u30b9\u30c6\u30e0\u300d,<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a \u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f32013 \u8ad6\u6587\u96c6, pp. 352-355, 2013<br \/>\n[13] \u6731 \u5e83\u6bc5, \u6fa4\u7530\u79c0\u4e4b:\u300c\u624b\u638c\u3078\u306eCutaneous Rabbit\u73fe\u8c61\u306e\u5448\u793a\u300d,<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c13\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2012)\u30012012\u5e7412\u670818\u65e5\uff5e20\u65e5, pp. 2048-2011, 2012<br \/>\n[14] \u6b66\u7530\u512a\u6597, \u6fa4\u7530\u79c0\u4e4b:\u300c\u753b\u50cf\u7279\u5fb4\u306e\u81ea\u52d5\u62bd\u51fa\u306b\u3088\u308b\u89e6\u899a\u611f\u899a\u306e\u5448\u793a\u300d,<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c13\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2012)\u30012012\u5e7412\u670818\u65e5\uff5e20\u65e5, pp. 
2052-2055, 2012<br \/>\n[15] \u77f3\u4e0a\u967d\u4e00, \u6fa4\u7530\u79c0\u4e4b:\u300c\u30e9\u30f3\u30c0\u30e0\u6642\u9593\u9045\u5ef6\u30d5\u30a3\u30eb\u30bf\u3092\u7528\u3044\u305f\u30cf\u30a6\u30ea\u30f3\u30b0\u30ad\u30e3\u30f3\u30bb\u30e9\u306e\u69cb\u7bc9\u300d,<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c13\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2012)\u30012012\u5e7412\u670818\u65e5\uff5e20\u65e5, pp. 2731-2734, 2012<br \/>\n[16] \u5b87\u9ad8 \u9759, \u6fa4\u7530\u79c0\u4e4b:\u300c\u4eba\u9593\u306e\u632f\u308b\u821e\u3044\u3092\u6a21\u5023\u3057\u305f\u97f3\u97ff\u8074\u53d6\u30ed\u30dc\u30c3\u30c8\u306e\u69cb\u7bc9\u300d,<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c13\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2012)\u30012012\u5e7412\u670818\u65e5\uff5e20\u65e5, pp. 2735-2738, 2012<br \/>\n[17] \u5bae\u5185\u5275, \u5c71\u4e0b\u96c5\u5f18, \u5185\u7530\u5553\u6cbb, \u6fa4\u7530\u79c0\u4e4b:\u300c\u30d5\u30a1\u30a4\u30d0\u30ec\u30fc\u30b6\u3092\u7528\u3044\u305fTi-Ni\u7d30\u7dda\u3068Cu\u7d30\u7dda\u306e\u5fae\u7d30\u6eb6\u63a5\u300d,<br \/>\n2012\u5e74\u5ea6\u7cbe\u5bc6\u5de5\u5b66\u4f1a\u4e2d\u56fd\u56db\u56fd\u652f\u90e8 -\u5cf6\u6839\u5730\u65b9\u5b66\u8853\u8b1b\u6f14\u4f1a-\u30012012\u5e7410\u67085\u65e5(\u91d1)\uff5e6\u65e5(\u571f), pp. 
21-22, 2012<br \/>\n[18] \u5bae\u5185\u5275, \u5c71\u4e0b\u96c5\u5f18, \u5185\u7530\u5553\u6cbb, \u6fa4\u7530\u79c0\u4e4b:\u300c\u30d5\u30a1\u30a4\u30d0\u30ec\u30fc\u30b6\u3092\u7528\u3044\u305fTi-Ni\u7d30\u7dda\u3068Cu\u7d30\u7dda\u306e\u5fae\u7d30\u6eb6\u63a5\u300d,<br \/>\n\u7523\u696d\u6280\u8853\u9023\u643a\u63a8\u9032\u4f1a\u8b70 \u88fd\u9020\u30d7\u30ed\u30bb\u30b9\u90e8\u4f1a \u7cbe\u5bc6\u5fae\u7d30\u52a0\u5de5\u5206\u79d1\u4f1a\u3000\u5e73\u621024\u5e74\u5ea6\u91d1\u578b\u30fb\u6750\u6599\u7814\u7a76\u4f1a\u30012012\u5e7411\u67088\u65e5(\u6728), pp. 58-59, 2012<br \/>\n[19]\u00a0<span class=\"style8\">Best English Presentation Award<\/span>, Guangyi Zhu and Hideyuki Sawada: \u201cPresenting the Sequential \u201cCutaneous Rabbit\u201d Illusion on Palm\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2012\u5e749\u670829\u65e5(\u571f), p.346, 2012<br \/>\n[20] Mitsuki Kitani and Hideyuki Sawada: \u201cHuman-like Expressive Speech Production by a Talking Robot\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2012\u5e749\u670829\u65e5(\u571f), p.114, 2012<br \/>\n[21] Chika Udaka and Hideyuki Sawada: \u201cImitating Human Listening Behavior by Using a Robotic Auditory System\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2012\u5e749\u670829\u65e5(\u571f), p.116, 2012<br \/>\n[22] Yoshiaki Iwatani and Hideyuki Sawada: \u201cThe air guitar system with tactile presentation\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2012\u5e749\u670829\u65e5(\u571f), p.229, 2012<br \/>\n[23] Yuto Takeda and Hideyuki Sawada: \u201cA Tactile Pen and the Presentation of Tactile Sensations from a Touch Screen\u201d,<br 
\/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u9023\u5408\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2012\u5e749\u670829\u65e5(\u571f), p.345, 2012<br \/>\n[24] \u5ca9\u8c37\u4eae\u660e, \u6fa4\u7530\u79c0\u4e4b:\u300c\u89e6\u899a\u5448\u793a\u3092\u5099\u3048\u305f\u6ca1\u5165\u578b\u30a8\u30a2\u30ae\u30bf\u30fc\u30b7\u30b9\u30c6\u30e0\u300d,<br \/>\n<a href=\"http:\/\/www.microsoft.com\/ja-jp\/ijarc\/event\/kinect_workshop.aspx\">\u7b2c2\u56de Microsoft Kinect for Windows Workshop<\/a>, \u65e5\u672c\u79d1\u5b66\u672a\u6765\u9928, 2013\u5e742\u670828\u65e5(\u6728)<br \/>\n<\/span><\/p>\n<p class=\"style3\"><span class=\"style4\"><strong>2011\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span><span class=\"style6\">\u3000 [1] \u798f\u5c71\u60e0\u58eb, \u6fa4\u7530\u79c0\u4e4b:\u300c\u8996\u89e6\u899a\u306e\u540c\u6642\u523a\u6fc0\u306b\u3088\u308b\u30c6\u30af\u30b9\u30c1\u30e3\u611f\u899a\u5448\u793a\u30b7\u30b9\u30c6\u30e0\u306e\u69cb\u7bc9\u3068\u305d\u306e\u8a55\u4fa1\u300d,<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a\u8ad6\u6587\u8a8c, Vol.52, No.4, pp.1562-1570, 2011<br \/>\n[2] \u6728\u8c37\u5149\u6765, \u539f\u9054\u77e2, \u6fa4\u7530\u79c0\u4e4b:\u300cDual-SOM\u306e\u4f4d\u76f8\u69cb\u9020\u5b66\u7fd2\u306b\u57fa\u3065\u304f\u767a\u8a71\u30ed\u30dc\u30c3\u30c8\u306e\u81ea\u5f8b\u7684\u97f3\u58f0\u7372\u5f97\u300d,<br \/>\n\u65e5\u672c\u6a5f\u68b0\u5b66\u4f1a\u8ad6\u6587\u96c6 C\u7de8, Vol.77, No.775, pp.1062-1070, 2011<br \/>\n[3] Mitsuki Kitani, Tatsuya Hara and Hideyuki Sawada, \u201cVoice articulatory training with a talking robot for the auditory impaired\u201d,<br \/>\nInternational Journal on disability and human development, Vol.10, No.1, pp.63-67, 2011<br \/>\n[4] Rosdiyana Samad and Hideyuki Sawada, \u201cExtraction of the minimum number of Gabor wavelet parameters for the recognition of natural facial expressions\u201d,<br \/>\nInternational Journal of Artificial Life and Robotics, Vol.16, No.1, pp.21-31, 2011<br 
\/>\n[5] Mitsuki Kitani, Tatsuya Hara, Hiroki Hanada and Hideyuki Sawada, \u201cA Talking Robot and Its Singing Performance by the Mimicry of Human Vocalization\u201d,<br \/>\nHuman-Computer Systems Interaction: Backgrounds and Applications Part 2, Advances in Intelligent and Soft Computing, Volume 99, pp.57-73, ISSN 1867-5662, 2012<br \/>\n[6]\u00a0<span class=\"style8\">Best Paper Finalist<\/span>,<br \/>\nMuhamad Hafiz and Hideyuki Sawada: \u201cPresentation of Button Repulsive Sensations on Touch Screen Using SMA Wires\u201d,<br \/>\nIEEE International Conference on Mechatronics and Automation, pp. 1-6, 2011<br \/>\n[7]\u00a0<span class=\"style8\">Best Paper Award<\/span>\u00a0in the area of Tactile and Haptic Interfaces,<br \/>\nChangan Jiang, Feng Zhao, Keiji Uchida and Hideyuki Sawada: \u201cResearch and Development on Portable Braille Display Using Shape Memory Alloy Wires\u201d,<br \/>\nInternational Conference on Human System Interaction, pp.318-323, 2011<br \/>\n[8] Mitsuki Kitani, Tatsuya Hara, Hiroki Hanada and Hideyuki Sawada: \u201cA Talking Robot and Its Human-like Expressive Speech Production\u201d,<br \/>\nInternational Conference on Human System Interaction, pp.203-208, 2011<br \/>\n[9] Hideyuki Sawada and Mitsuki Kitani: \u201cSeveral Approaches to Speech and Auditory Systems for Human System Interactions\u201d,<br \/>\nInternational Conference on Human System Interaction, pp.409-414, 2011<br \/>\n[10] Rosdiyana Samad and Hideyuki Sawada: \u201cEdge-Based Facial Feature Extraction Using Gabor Wavelet and Convolution Filters\u201d,<br \/>\n12th IAPR Conference on Machine Vision Applications, pp.430-433, 2011<br \/>\n[11] Hideyuki Sawada, Chang\u2019an Jiang and Hirofumi Takase: \u201cTactoGlove \u2013 Displaying Tactile Sensations in Tacto-gestural Interaction -\u201c,<br \/>\nInternational Conference on Biometrics and Kansei Engineering, pp.216-221, 2011<br \/>\n[12] Changan Jiang, Keiji Uchida and Hideyuki Sawada: \u201cDevelopment of Vision 
based Tactile Display System using Shape Memory Alloys\u201d,<br \/>\nInternational Conference on Advanced Mechatronic Systems, pp.570-575, 2011<br \/>\n[13] Hathaichanok Thavichai, Nattapong Swangmuang, Kasemsak Uthaichana, Nipon Theera-Umpon, Hideyuki Sawada, and Tanachai Pankasemsuk: \u201cA Community-driven Research Initiative: Acoustic Technology for Non-Destructive Evaluation to Increase Export Value of Mangosteens\u201d,<br \/>\n11th Annual SEAAIR Conference, pp.75-79, 2011<br \/>\n[14]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u4eba\u3068\u30ed\u30dc\u30c3\u30c8\u306e\u97f3\u58f0\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u3068\u767a\u8a71\u30ed\u30dc\u30c3\u30c8\u300d\u3001<br \/>\nROBOTECH \u6b21\u4e16\u4ee3\u30ed\u30dc\u30c3\u30c8\u88fd\u9020\u6280\u8853\u5c55\u3001\u6771\u4eac\u30d3\u30c3\u30b0\u30b5\u30a4\u30c8\u30012011\u5e747\u670813\u65e5(\u6c34)<br \/>\n[15]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u8996\u899a\u30fb\u8074\u899a\u30fb\u89e6\u899a\u3092\u901a\u3057\u305f\u4eba\u3068\u30ed\u30dc\u30c3\u30c8\u306e\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u6280\u8853\u300d\u3001<br \/>\nMEMS\u30a4\u30ce\u30d9\u30fc\u30b7\u30e7\u30f3\u30ef\u30fc\u30af\u30b7\u30e7\u30c3\u30d7\u3001\u6771\u4eac\u30d3\u30c3\u30b0\u30b5\u30a4\u30c8\u30012011\u5e747\u670815\u65e5(\u91d1)<br \/>\n[16]\u00a0<span class=\"style10\">\u8b1b\u7fd2\u4f1a\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u89e6\u899a\u30d5\u30a3\u30fc\u30c9\u30d0\u30c3\u30af\u6280\u8853\u306e\u958b\u767a\u3068\u30bf\u30c3\u30c1\u30d1\u30cd\u30eb\u30fb\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u3078\u306e\u5fdc\u7528\u300d\u3001<br \/>\n\u60c5\u5831\u6a5f\u69cb\u30012011\u5e7410\u670814\u65e5(\u91d1)<br \/>\n[17]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, 
\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u8996\u899a\u30fb\u8074\u899a\u30fb\u89e6\u899a\u3092\u901a\u3057\u305f\u4eba\u3068\u30ed\u30dc\u30c3\u30c8\u306e\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u6280\u8853\u300d\u3001<br \/>\n2011\u56fd\u969b\u30ed\u30dc\u30c3\u30c8\u5c55 RT\u4ea4\u6d41\u30d7\u30e9\u30b6\u3001\u6771\u4eac\u30d3\u30c3\u30b0\u30b5\u30a4\u30c8\u30012011\u5e7411\u67089\u65e5(\u6c34)\uff5e12\u65e5(\u571f)<br \/>\n[18] \u5e73\u4e95\u512a\u53f8, \u9577\u4e95\u7f8e\u548c, \u6a2a\u4e95\u82f1\u4eba, \u6fa4\u7530\u79c0\u4e4b:\u300c\u30aa\u30f3\u30c8\u30ed\u30b8\u30fc\u306b\u5bfe\u5fdc\u3057\u305f\u96fb\u5b50\u30ab\u30eb\u30c6\u5165\u529b\u652f\u63f4\u30b7\u30b9\u30c6\u30e0\u3068\u533b\u7642\u73fe\u5834\u3067\u306e\u97f3\u58f0\u5165\u529b\u5b9f\u9a13\u300d,<br \/>\n<a href=\"http:\/\/www.ho.chiba-u.ac.jp\/jami2011symp\/\">\u7b2c15\u56de\u65e5\u672c\u533b\u7642\u60c5\u5831\u5b66\u4f1a\u6625\u5b63\u5b66\u8853\u5927\u4f1a<\/a>\u00a0\u8b1b\u6f14\u8ad6\u6587\u96c6 B2-1, 2011\u5e746\u670817\u65e5(\u91d1)\uff5e18\u65e5(\u571f)<br \/>\n[19] \u6728\u8c37\u5149\u6765, \u5927\u4e45\u4fdd\u548c\u54c9, \u6fa4\u7530\u79c0\u4e4b:\u300c\u30ed\u30dc\u30c3\u30c8\u306b\u3088\u308b\u8074\u899a\u969c\u304c\u3044\u8005\u306e\u767a\u8a71\u52d5\u4f5c\u306e\u518d\u73fe\u300d,<br \/>\n\u96fb\u5b50\u60c5\u5831\u901a\u4fe1\u5b66\u4f1a HCG\u30b7\u30f3\u30dd\u30b8\u30a6\u30e02011, PP.196-201, 2011<br \/>\n[20] \u5b87\u9ad8\u9759, \u6fa4\u7530\u79c0\u4e4b:\u300c\u4eba\u306e\u97f3\u97ff\u8074\u53d6\u3092\u6a21\u5023\u3057\u305f\u7279\u5b9a\u97f3\u97ff\u306e\u80fd\u52d5\u7684\u30bb\u30f3\u30b7\u30f3\u30b0\u300d,<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c12\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2011)\u30012011\u5e7412\u670823\u65e5\uff5e25\u65e5, pp. 
878-881, 2011<br \/>\n[21] \u5965\u672c\u771f\u6cbb, \u59dc\u9577\u5b89, \u8d99\u92d2, \u6fa4\u7530\u79c0\u4e4b:\u300c\u89e6\u899a\u30b0\u30ed\u30fc\u30d6\u306b\u3088\u308b\u4eee\u60f3\u30ad\u30e3\u30e9\u30af\u30bf\u3068\u306e\u30b8\u30a7\u30b9\u30c1\u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f3\u300d,<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c12\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2011)\u30012011\u5e7412\u670823\u65e5\uff5e25\u65e5, pp. 2232-2235, 2011<br \/>\n[22]\u00a0<span class=\"style8\"><a href=\"http:\/\/www.ieee-jp.org\/section\/shikoku\/JointConventionAward\/index.html\">Outstanding English Presentation Award<\/a><\/span>, Mitsuki Kitani and Hideyuki Sawada: \u201cAnalysis of Robotic Speech employing the Vocal Tract Area Function\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2011\u5e749\u670823\u65e5(\u91d1)<br \/>\n[23] Rosdiyana Samad and Hideyuki Sawada: \u201cCompressed Gabor Features and KNN Classification for Facial Expression Recognition\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2011\u5e749\u670823\u65e5(\u91d1)<br \/>\n[24] Chika Udaka and Hideyuki Sawada: \u201cEstimating and Tracking Different Sounds by a Robotic Auditory System\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2011\u5e749\u670823\u65e5(\u91d1)<br \/>\n[25] Yoichi Ishigami and Hideyuki Sawada: \u201cHowling Canceller Based on Travelling Time Difference between Two Microphones\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2011\u5e749\u670823\u65e5(\u91d1)<br \/>\n[26] Kazuya Okubo and Hideyuki Sawada: \u201cSpeech Communication with a Taking Robot employing a Chatter Robot\u201d,<br 
\/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, 2011\u5e749\u670823\u65e5(\u91d1)<br \/>\n<\/span><\/p>\n<p class=\"style3\"><span class=\"style4\"><strong>2010\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span><span class=\"style6\">\u3000 [1]\u00a0<a href=\"http:\/\/www.ipsj.or.jp\/01kyotsu\/award\/ronbun_sho\/h21_detail.html\"><span class=\"style10\">\u5e73\u621021\u5e74\u5ea6 \u60c5\u5831\u51e6\u7406\u5b66\u4f1a\u8ad6\u6587\u8cde (2010\u5e745\u670831\u65e5\u53d7\u8cde)<\/span>\u00a0<\/a>,<br \/>\n\u6c34\u4e0a\u967d\u4ecb, \u6fa4\u7530\u79c0\u4e4b:\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u7cf8\u3092\u7528\u3044\u305f\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u3068\u5fae\u5c0f\u632f\u52d5\u306e\u767a\u751f\u78ba\u7387\u5bc6\u5ea6\u5236\u5fa1\u306b\u3088\u308b\u89e6\u899a\u611f\u899a\u306e\u5448\u793a\u300d,<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a\u8ad6\u6587\u8a8c Vol.49, No.12, pp.3890-3898, 2008<br \/>\n[2]\u00a0<a href=\"http:\/\/hsi.wsiz.rzeszow.pl\/kenote_speakers.php\"><span class=\"style10\">Keynote Speech<\/span><\/a>, Hideyuki Sawada: \u201cDisplaying Tactile sensations and the perspectives of multimodal interface\u201d,<br \/>\n<a href=\"http:\/\/hsi.wsiz.rzeszow.pl\/\">3rd International Conference on Human System Interaction<\/a>, Rzeszow, Poland, Saturday, May 15th, 2010<br \/>\n[3]\u00a0<span class=\"style8\">Best Paper Award<\/span>\u00a0in the area of Human Sensory Factors and their applications,<br \/>\nMitsuki Kitani, Tatsuya Hara, Hiroki Hanada and Hideyuki Sawada: \u201cA talking robot for the vocal communication by the mimicry of human voice\u201d,<br \/>\nInternational Conference on Human System Interaction, pp. 
728-733, 2010<br \/>\n[4] Hideyuki Sawada and Atsushi Todo: \u201cIntegration of Sound and Image Information for Active Tracking of Particular Person\u201d,<br \/>\nSeventh Annual International Conference on Electrical Engineering\/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON 2010), pp. 861-865, 2010<br \/>\n[5] Mitsuki Kitani, Tatusya Hara, Hiroki Hanada and Hideyuki Sawada: \u201cRobotic Vocalization Training System for the Auditory-impaired\u201d,<br \/>\n<a href=\"http:\/\/www.icdvrat.rdg.ac.uk\/\">International Conference on Disability, Virtual Reality and Associated Technologies<\/a>, pp.263-272, 2010<br \/>\n[6] Francois Gatto, Eric Benoit and Hideyuki Sawada: \u201cInformation fusion merging for person recognition and localization\u201d,<br \/>\n<a href=\"http:\/\/eam2010.sd.keio.ac.jp\/\">The 8th France-Japan and 6th Europe-Asia Congress on Mechatronics<\/a>, pp.157-162, 2010<br \/>\n[7] \u9ad8\u702c\u88d5\u53f2, \u59dc\u9577\u5b89, \u6fa4\u7530\u79c0\u4e4b:\u300c\u62e1\u5f35\u73fe\u5b9f\u7a7a\u9593\u306b\u304a\u3051\u308b\u89e6\u611f\u899a\u5448\u793a\u306b\u3088\u308b\u4eee\u60f3\u30ad\u30e3\u30e9\u30af\u30bf\u3068\u306e\u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f3\u30b7\u30b9\u30c6\u30e0\u300d,<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a \u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f32011 \u8ad6\u6587\u96c6, pp. 
83-90, 2011<br \/>\n[8] \u6fa4\u7530\u79c0\u4e4b:\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u7cf8\u306e\u5fae\u5c0f\u632f\u52d5\u3092\u5229\u7528\u3057\u305f\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u300d,<br \/>\n\u30b1\u30df\u30ab\u30eb\u30a8\u30f3\u30b8\u30cb\u30a2\u30ea\u30f3\u30b0, Vol.56, No.1, pp.21-27, 2011<br \/>\n[9] \u6fa4\u7530\u79c0\u4e4b:\u300c\u89e6\u899a\u5448\u793a\u7814\u7a76\u3092\u901a\u3057\u3066\u898b\u3048\u305f\u89e6\u899a\u306e\u96e3\u3057\u3055\u3068\u5c06\u6765\u6027\u300d,<br \/>\n\u60c5\u5831\u51e6\u7406, Vol.51, No.7, p.881, 2010<br \/>\n[10]\u00a0<a href=\"http:\/\/www.science-t.com\/book\/A061.htm\">\u89e6\u899a\u8a8d\u8b58\u30e1\u30ab\u30cb\u30ba\u30e0\u3068\u5fdc\u7528\u6280\u8853 \uff0d\u89e6\u899a\u30bb\u30f3\u30b5\u30fb\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\uff0d<\/a>, \u30b5\u30a4\u30a8\u30f3\u30b9&amp;\u30c6\u30af\u30ce\u30ed\u30b8\u30fc, 2010\u5e749\u670829\u65e5 \u767a\u884c, ISBN978-4-86428-001-3 C3058, \u300c\u5fae\u5c0f\u632f\u52d5\u5b50\u30a2\u30ec\u30a4\u3092\u7528\u3044\u305f\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u3068\u89e6\u611f\u899a\u306e\u63d0\u793a\u300d\u3092\u57f7\u7b46<br \/>\n[11]\u00a0<span class=\"style10\">\u65e5\u672c\u6a5f\u68b0\u5b66\u4f1a \u8b1b\u7fd2\u4f1a<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5fae\u5c0f\u632f\u52d5\u5b50\u30a2\u30ec\u30a4\u306b\u3088\u308b\u89e6\u899a\u306e\u5e7b\u899a\u751f\u8d77\u3068\u9ad8\u6b21\u77e5\u899a\u306b\u95a2\u3059\u308b\u8af8\u73fe\u8c61\u300d,<br \/>\n\u6a5f\u7d20\u6f64\u6ed1\u8a2d\u8a08\u90e8\u9580\u300c\u89e6\u899a\u6280\u8853\u306e\u57fa\u790e\u3068\u5fdc\u7528 \uff5e\u30d2\u30c8\u306e\u89e6\u899a\u7406\u89e3\u304b\u3089\u30d2\u30e5\u30fc\u30de\u30f3\u30de\u30b7\u30f3\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u3084\u30ed\u30dc\u30c3\u30c8\u3078\u306e\u5fdc\u7528\u307e\u3067\uff5e\u300d, \u6771\u4eac\u5927\u5b66, 2010\u5e747\u670823\u65e5(\u91d1)<br \/>\n[12]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, 
\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u30ef\u30a4\u30e4\u306e\u5fae\u5c0f\u632f\u52d5\u3092\u5229\u7528\u3057\u305f\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u300d\u3001<br \/>\n\u30a4\u30ce\u30d9\u30fc\u30b7\u30e7\u30f3\u30fb\u30b8\u30e3\u30d1\u30f32010-\u5927\u5b66\u898b\u672c\u5e02 \u65b0\u6280\u8853\u8aac\u660e\u4f1a\u3001\u6771\u4eac\u56fd\u969b\u30d5\u30a9\u30fc\u30e9\u30e0\u30012010\u5e749\u670829\u65e5(\u6c34)<br \/>\n[13]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u30ef\u30a4\u30e4\u3092\u7528\u3044\u305f\u643a\u5e2f\u578b\u70b9\u5b57\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u306e\u958b\u767a\u300d\u3001<br \/>\n\u7b2c4\u56de\u5927\u5b66\u7b49\u306e\u7814\u7a76\u30b7\u30fc\u30ba\u767a\u8868\u4f1a\uff0d\u9999\u5ddd\u306e\u6700\u5148\u7aef\u3082\u306e\u3065\u304f\u308a\u6280\u8853\uff0d\u3001\u30b5\u30f3\u30e1\u30c3\u30bb\u9999\u5ddd\u3001JST\u30a4\u30ce\u30d9\u30fc\u30b7\u30e7\u30f3\u30b5\u30c6\u30e9\u30a4\u30c8\u5fb3\u5cf6\u30012010\u5e7412\u67089\u65e5(\u6728)<br \/>\n[14]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u30ef\u30a4\u30e4\u306e\u5fae\u5c0f\u632f\u52d5\u3092\u5229\u7528\u3057\u305f\u89e6\u899a\u5448\u793a\u300d\u3001<br \/>\n\u79d1\u5b66\u6280\u8853\u632f\u8208\u6a5f\u69cb(JST)\u3064\u306a\u3050\u3057\u304f\u307f\u65b0\u6280\u8853\u8aac\u660e\u4f1a\u3001JST\u30db\u30fc\u30eb\u30012011\u5e741\u670813\u65e5(\u6728)<br \/>\n[15]\u00a0<span class=\"style10\">Invited Lecture<\/span>, Hideyuki Sawada: \u201cComputational intelligence and the applications to human-robot interactions\u201d,<br \/>\nComputer Engineering Department, Chiang Mai University, Thailand, Wednesday 30th June, 2010<br \/>\n[16] Hideyuki Sawada and Seiji Hata: \u201cMechatronics technologies for supporting handicapped and aged people\u201d,<br \/>\n<a 
href=\"http:\/\/rac.oop.cmu.ac.th\/cmuku\/\">3nd Joint Symposium between Chiang Mai University and Kagawa University<\/a>, p.109-110, 2010<br \/>\n[17] Nattapong Swangmuang, Kasemsak Uthaichana, Hideyuki Sawada and Nipon Theera-Umpon: \u201cAcoustic-based signal transmission for mangosteen internal quality measures\u201d,<br \/>\n<a href=\"http:\/\/rac.oop.cmu.ac.th\/cmuku\/\">3nd Joint Symposium between Chiang Mai University and Kagawa University<\/a>, p.32-34, 2010<br \/>\n[18]\u00a0<span class=\"style8\"><a href=\"http:\/\/www.ele.kochi-tech.ac.jp\/ieeeshikoku\/presentation_award_10\/student_award2010.html\">IEEE Best Presentation Award<\/a><\/span>, Rosdiyana Samad and Hideyuki Sawada: \u201cComparison of Two Classifiers on Simple Gabor Features for Facial Expression Recognition\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.300, 2010\u5e749\u670825\u65e5(\u571f)<br \/>\n[19]\u00a0<span class=\"style8\"><a href=\"http:\/\/www.ele.kochi-tech.ac.jp\/ieeeshikoku\/presentation_award_10\/student_award2010.html\">IEEE Best Presentation Award<\/a><\/span>, Muhamad Hafiz and Hideyuki Sawada: \u201cEmpirical Study of Button Pressing Feedback Sensation Display on Touch Screen\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.301, 2010\u5e749\u670825\u65e5(\u571f)<br \/>\n[20]\u00a0<span class=\"style8\"><a href=\"http:\/\/www.ele.kochi-tech.ac.jp\/ieeeshikoku\/presentation_award_10\/student_award2010.html\">IEEE Best Presentation Award<\/a><\/span>, Hirofumi Takase and Hideyuki Sawada: \u201cA Tactile Glove for Gestural Interface with Tactile Feedback\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.302, 2010\u5e749\u670825\u65e5(\u571f)<br \/>\n[21]\u00a0<span class=\"style8\">\u512a\u79c0\u767a\u8868\u8cde<\/span>, Chika Udaka and Hideyuki Sawada: 
\u201cEstimation of Sound Source Direction using a Robotic Arm\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.334, 2010\u5e749\u670825\u65e5(\u571f)<br \/>\n[22] Mitsuki Kitani and Hideyuki Sawada: \u201cConstruction of an Interactive Training System Using a Talking Robot\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.335, 2010\u5e749\u670825\u65e5(\u571f)<br \/>\n[23]\u00a0<span class=\"style8\">\u5b66\u751f\u5968\u52b1\u8cde<\/span>, \u5e73\u4e95\u512a\u53f8\u3001\u9577\u4e95\u7f8e\u548c\u3001\u6a2a\u4e95\u82f1\u4eba\u3001\u6fa4\u7530\u79c0\u4e4b:\u300c\u30aa\u30f3\u30c8\u30ed\u30b8\u30fc\u306b\u5bfe\u5fdc\u3057\u305f\u97f3\u58f0\u8a8d\u8b58\u306b\u3088\u308b\u96fb\u5b50\u30ab\u30eb\u30c6\u5165\u529b\u652f\u63f4\u30b7\u30b9\u30c6\u30e0\u300d,<br \/>\n<a href=\"http:\/\/jcmi2010.e-rad.jp\/\">\u7b2c30\u56de\u533b\u7642\u60c5\u5831\u5b66\u9023\u5408\u5927\u4f1a<\/a>\u00a0\u8b1b\u6f14\u8ad6\u6587\u96c6, pp.346-349, 2010\u5e7411\u670821\u65e5(\u65e5)<br \/>\n[24]\u00a0<span class=\"style8\">IEEE Best Award<\/span>, Mitsuki Kitani, Tatsuya Hara, Hiroki Hanada and Hideyuki Sawada: \u201cA Talking Robot and the Auditory Feedback Learning for Natural Vocalization\u201d,<br \/>\n<a href=\"http:\/\/www.young-researchers.net\/tyrw7th\/index.html\">The 7th IEEE Tokyo Young Researchers Workshop<\/a>, p.25, 2010\u5e7411\u670820\u65e5(\u571f)<br \/>\n[25] Rosdiyana Samad and Hideyuki Sawada: \u201cCombination of Edge Operators in the Feature Extraction for Facial Expression Recognition\u201d,<br \/>\n<a href=\"http:\/\/www.young-researchers.net\/tyrw7th\/index.html\">The 7th IEEE Tokyo Young Researchers Workshop<\/a>, p.25, 2010\u5e7411\u670820\u65e5(\u571f)<br \/>\n<\/span><\/p>\n<p class=\"style3\"><span class=\"style4\"><strong>2009\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span><span class=\"style6\">\u3000 [1] 
Khairunizam Wan and Hideyuki Sawada: \u201cGesture Recognition based on the Probability Distribution of Arm Trajectories\u201d,<br \/>\nSICE Journal of Control, Measurement and System Integration, Vol.2, No.5, pp.263-p.270, 2009<br \/>\n[2]\u00a0<span class=\"style10\">Invited Talk<\/span>, Hideyuki Sawada and Mitsuki Kitan: \u201cA Talking Robot and the Interactive Speech Training for Hearing-Impaired\u201d,<br \/>\nGeneral Meeting of the Acoustical Society of America, May 18th, 2009<br \/>\n[3]\u00a0<span class=\"style8\">Best Paper Award<\/span>\u00a0in the area of Human Factors for Human-System Interaction,<br \/>\nKeishi Fukuyama, Naoki Takahashi, Feng Zhao, and Hideyuki Sawada: \u201cTactile Display Using the Vibration of SMA Wires and the Evaluation of Perceived Sensations\u201d,<br \/>\nIEEE International Conference on Human System Interaction, pp. 685-690, 2009<br \/>\n[4]\u00a0<span class=\"style8\">Student Travel Grant Award<\/span>,<br \/>\nHirofumi Takase and Hideyuki Sawada: \u201cGestural Interface and the Intuitive Interaction with Virtual Objects\u201d,<br \/>\nICROS-SICE International Joint Conference 2009, pp. 3260-3263, August 18-21, 2009<br \/>\n[5] Hideyuki Sawada and Seiji Hata: \u201cSensing and monitoring technologies for agricultural applications\u201d,<br \/>\nCMU\/KU Symposium on Food Safety Technologies in South East Asia, The 2nd International Meeting for Development of International Network for Reduction of Agrochemical Use, September 22-23, 2009<br \/>\n[6] Feng Zhao, Keishi Fukuyama and Hideyuki Sawada: \u201cCompact Braille display using SMA wire array\u201d,<br \/>\nThe 18th IEEE International Symposium on Robot and Human Interactive Communication, pp. 28-33, Toyama, Japan, Sept. 27-Oct. 
2, 2009<br \/>\n[7] Mitsuki Kitani, Tatsuya Hara and Hideyuki Sawada: \u201cA Talking Robot and the Adaptive Learning of Speech Articulation Using 3D SOM\u201d,<br \/>\nIEEE International Symposium on Biomedical Engineering (IEEE ISBME2009), CD-ROM proceedings WM1-1 #1042, December 14th \u2013 18th, 2009<br \/>\n[8]\u00a0<span class=\"style8\">Young Author Award<\/span>,<br \/>\nRosdiyana Samad and Hideyuki Sawada: \u201cA Study of Dimension Reduction of Gabor Features from Different Facial Expressions\u201d,<br \/>\nInternational Symposium on Artificial Life and Robotics, CD-ROM proceedings #159, February 4th \u2013 6th, 2010<br \/>\n[9]\u00a0<a href=\"http:\/\/www.nano-opt.jp\/press\/robot_handbook.pdf\">\u30ed\u30dc\u30c3\u30c8\u60c5\u5831\u5b66\u30cf\u30f3\u30c9\u30d6\u30c3\u30af<\/a>, \u30ca\u30ce\u30aa\u30d7\u30c8\u30cb\u30af\u30b9\u30fb\u30a8\u30ca\u30b8\u30fc\u51fa\u7248\u5c40, 2010\u5e743\u670819\u65e5 \u767a\u884c, ISBN 978-4-7649-5507-3 C3040, \u300c\u97f3\u58f0\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u300d\u3092\u57f7\u7b46<br \/>\n[10]\u00a0<span class=\"style8\">Keynote Talk<\/span>,<br \/>\nHideyuki Sawada: \u201cSystem intelligence and the applications to human-system interaction\u201d,<br \/>\nKagawa University-University of British Columbia Joint Workshop on Human-Machine Interactive System, February 22nd, 2010<br \/>\n[11] \u798f\u5c71\u60e0\u58eb\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u30c6\u30af\u30b9\u30c1\u30e3\u611f\u899a\u5448\u793a\u30b7\u30b9\u30c6\u30e0\u306e\u8a66\u4f5c\u3068\u8996\u899a\u523a\u6fc0\u306e\u89e6\u899a\u77e5\u899a\u3078\u306e\u5f71\u97ff\u306e\u8a55\u4fa1\u300d,<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a \u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f32010 \u8ad6\u6587\u96c6, pp. 
247-250, 2010<br \/>\n[12]\u00a0<span class=\"style10\">\u65e5\u672c\u6a5f\u68b0\u5b66\u4f1a \u8b1b\u7fd2\u4f1a<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5fae\u5c0f\u632f\u52d5\u5b50\u30a2\u30ec\u30a4\u306b\u3088\u308b\u89e6\u899a\u306e\u5e7b\u899a\u751f\u8d77\u3068\u9ad8\u6b21\u77e5\u899a\u306b\u95a2\u3059\u308b\u8af8\u73fe\u8c61\u300d,<br \/>\n\u6a5f\u7d20\u6f64\u6ed1\u8a2d\u8a08\u90e8\u9580\u300c\u89e6\u899a\u6280\u8853\u306e\u57fa\u790e\u3068\u5fdc\u7528 \uff5e\u30d2\u30c8\u306e\u89e6\u899a\u7406\u89e3\u304b\u3089\u30d2\u30e5\u30fc\u30de\u30f3\u30de\u30b7\u30f3\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u3084\u30ed\u30dc\u30c3\u30c8\u3078\u306e\u5fdc\u7528\u307e\u3067\uff5e\u300d, \u5927\u962a\u5927\u5b66, 2009\u5e747\u670824\u65e5(\u91d1)<br \/>\n[13]\u00a0<span class=\"style10\">\u65e5\u672c\u6a5f\u68b0\u5b66\u4f1a \u8b1b\u7fd2\u4f1a<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5fae\u5c0f\u632f\u52d5\u5b50\u30a2\u30ec\u30a4\u306b\u3088\u308b\u89e6\u899a\u306e\u5e7b\u899a\u751f\u8d77\u3068\u9ad8\u6b21\u77e5\u899a\u306b\u95a2\u3059\u308b\u8af8\u73fe\u8c61\u300d,<br \/>\n\u6a5f\u7d20\u6f64\u6ed1\u8a2d\u8a08\u90e8\u9580\u300c\u89e6\u899a\u6280\u8853\u306e\u57fa\u790e\u3068\u5fdc\u7528 \uff5e\u30d2\u30c8\u306e\u89e6\u899a\u7406\u89e3\u304b\u3089\u30d2\u30e5\u30fc\u30de\u30f3\u30de\u30b7\u30f3\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u3084\u30ed\u30dc\u30c3\u30c8\u3078\u306e\u5fdc\u7528\u307e\u3067\uff5e\u300d, \u540d\u53e4\u5c4b\u5927\u5b66, 2009\u5e747\u670831\u65e5(\u91d1)<br \/>\n[14]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u7cf8\u306e\u5fae\u5c0f\u632f\u52d5\u3092\u5229\u7528\u3057\u305f\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u300d\u3001<br \/>\n\u79d1\u5b66\u6280\u8853\u632f\u8208\u6a5f\u69cb(JST) \u65b0\u6280\u8853\u8aac\u660e\u4f1a\u3001JST\u30db\u30fc\u30eb\u30012009\u5e744\u67083\u65e5(\u91d1)<br \/>\n[15]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, 
\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u30ef\u30a4\u30e4\u306e\u5fae\u5c0f\u632f\u52d5\u3092\u5229\u7528\u3057\u305f\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u300d\u3001<br \/>\n\u30a4\u30ce\u30d9\u30fc\u30b7\u30e7\u30f3\u30fb\u30b8\u30e3\u30d1\u30f32009-\u5927\u5b66\u898b\u672c\u5e02 \u65b0\u6280\u8853\u8aac\u660e\u4f1a\u3001\u6771\u4eac\u56fd\u969b\u30d5\u30a9\u30fc\u30e9\u30e0\u30012009\u5e749\u670818\u65e5(\u91d1)<br \/>\n[16]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5fae\u5c0f\u632f\u52d5\u5b50\u30a2\u30ec\u30a4\u3092\u7528\u3044\u305f\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u3068\u89e6\u611f\u899a\u5448\u793a\u300d\u3001<br \/>\n\u6280\u8853\u60c5\u5831\u5354\u4f1a\u30012009\u5e7412\u670821\u65e5(\u6708)<br \/>\n[17] \u798f\u5c71\u60e0\u58eb\u3001\u758b\u7530\u7ae0\u535a\u3001\u8d99\u92d2\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5fae\u5c0f\u632f\u52d5\u5b50\u30a2\u30ec\u30a4\u306b\u3088\u308b\u89e6\u899a\u5448\u793a\u3068\u30c6\u30af\u30b9\u30c1\u30e3\u611f\u899a\u306e\u8a55\u4fa1\u300d\u3001<br \/>\n\u96fb\u6c17\u5b66\u4f1a \u60c5\u5831\u51e6\u7406\u6280\u8853\u59d4\u54e1\u4f1a,\u7523\u696d\u30b7\u30b9\u30c6\u30e0\u60c5\u5831\u5316\u6280\u8853\u59d4\u54e1\u4f1a\u3000\u5408\u540c\u7814\u7a76\u4f1a, \u5e73\u621021\u5e7410\u67089\u65e5(\u91d1), pp. 7-12, 2009<br \/>\n[18] \u798f\u5c71\u60e0\u58eb\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u8996\u899a\u3068\u89e6\u899a\u306e\u540c\u6642\u523a\u6fc0\u306b\u3088\u308b\u80fd\u52d5\u89e6\u6642\u306e\u30c6\u30af\u30b9\u30c1\u30e3\u611f\u899a\u5448\u793a\u30b7\u30b9\u30c6\u30e0\u306e\u8a66\u4f5c\u3068\u8a55\u4fa1\u300d\u3001<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c10\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2009)\u30012009\u5e7412\u670824\u65e5\uff5e26\u65e5, pp. 
157-160, 2009<br \/>\n[19] \u6fa4\u7530\u79c0\u4e4b\u3001\u6771\u85e4\u7be4\u53f2\uff1a\u300c\u97f3\u97ff\u4fe1\u53f7\u3068\u753b\u50cf\u4fe1\u53f7\u306e\u7d71\u5408\u51e6\u7406\u306b\u3088\u308b\u7279\u5b9a\u97f3\u97ff\u306e\u80fd\u52d5\u7684\u30bb\u30f3\u30b7\u30f3\u30b0\u300d\u3001<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c10\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2009)\u30012009\u5e7412\u670824\u65e5\uff5e26\u65e5, pp. 544-547, 2009<br \/>\n[20]\u00a0<span class=\"style8\">IEEE Best Presentation Award<\/span>, Keishi Fukuyama and Hideyuki Sawada: \u201cExperimental Study of Texture Presentation by the Tactile and Visual Stimuli\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.352, 2009\u5e749\u670826\u65e5(\u571f)<br \/>\n[21] Takafumi Ohara and Hideyuki Sawada: \u201cExtraction of Insects and Their Faces in a Nature Photo\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.327, 2009\u5e749\u670826\u65e5(\u571f)<br \/>\n[22] Mitsuki Kitani and Hideyuki Sawada: \u201cAdaptive Learning of the Speech Articulation of a Talking Robot Using 3D SOM\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.353, 2009\u5e749\u670826\u65e5(\u571f)<br \/>\n[23] Hirofumi Takase and Hideyuki Sawada: \u201cGestural Interface with Visual and Tactile Feedback in VR Space\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.354, 2009\u5e749\u670826\u65e5(\u571f)<br \/>\n[24] Akihiro Hikida and Hideyuki Sawada: \u201cObservation of the micro-vibration of a SMA wire on a human skin\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.355, 
2009\u5e749\u670826\u65e5(\u571f)<br \/>\n[25] Yuji Hirai, Hideto Yokoi and Hideyuki Sawada: \u201cVoice Recognition Tool to Input Electronic Medical Records having Ontological Structure\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.356, 2009\u5e749\u670826\u65e5(\u571f)<br \/>\n<\/span><\/p>\n<p class=\"style3\"><span class=\"style4\"><strong>2008\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span><span class=\"style6\">\u3000 [1] \u6c34\u4e0a\u967d\u4ecb\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u7cf8\u3092\u7528\u3044\u305f\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u3068\u5fae\u5c0f\u632f\u52d5\u306e\u767a\u751f\u78ba\u7387\u5bc6\u5ea6\u5236\u5fa1\u306b\u3088\u308b\u89e6\u899a\u611f\u899a\u306e\u5448\u793a\u300d,<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a\u8ad6\u6587\u8a8c Vol.49, No.12, pp.3890-3898, 2008<br \/>\n[2] Hideyuki Sawada, Mitsuki Kitani and Yasumori Hayashi: \u201cA Robotic Voice Simulator and the Interactive Training for Hearing-Impaired People\u201d,<br \/>\n<a href=\"http:\/\/www.hindawi.com\/journals\/jbb\/volume-2008\/si.1.html\">Journal of Biomedicine and Biotechnology<\/a>, Volume 2008, Article ID 768232, 2008<br \/>\n[3] Mitsuki Kitani, Yasumori Hayashi and Hideyuki Sawada: \u201cInteractive training of speech articulation for hearing impaired using a talking robot\u201d,<br \/>\n<a href=\"http:\/\/www.icdvrat.reading.ac.uk\/\">International Conference on Disability, Virtual Reality and Associated Technologies<\/a>, pp.293-301, 2008<br \/>\n[4] Keishi Fukuyama, Yohsuke Mizukami and Hideyuki Sawada: \u201cA Novel Micro-vibration Actuator and the Presentation of Tactile Sensations\u201d,<br \/>\nProceedings of the 12th IMEKO TC1 &amp; TC7 Joint Symposium on Man Science &amp; Measurement, pp. 
141-146, 2008<br \/>\n[5] Khairunizam Wan and Hideyuki Sawada: \u201cDynamic Gesture Recognition Based on the Probabilistic Distribution of Arm Trajectory\u201d,<br \/>\nIEEE International Conference on Mechatronics and Automation (ICMA 2008), pp. , 2008<br \/>\n[6] Mitsuki Kitani, Yasumori Hayashi, and Hideyuki Sawada: \u201cA Robotic Voice Simulator and the Articulatory Reproduction of Impaired Voices\u201d,<br \/>\nMecatronics2008, Paper #203, 2008<br \/>\n[7] Khairunizam Wan, Atsushi Todo, Hideyuki Sawada, Olivier Passalacqua, Eric Benoit, Marc-Philippe Huget, Patrice Moreaux: \u201cVideo conference smart room: an information fusion system based on distributed sensors\u201d,<br \/>\nMecatronics2008, Paper #212, 2008<br \/>\n[8]\u00a0<a href=\"http:\/\/www.baifukan.co.jp\/sinkan\/shokai\/067687.html\">\u6b21\u4e16\u4ee3\u30bb\u30f3\u30b5\u30cf\u30f3\u30c9\u30d6\u30c3\u30af<\/a>, \u57f9\u98a8\u9928, 2008\u5e747\u67088\u65e5 \u767a\u884c, ISBN 978-4-563-06768-7, \u300c\u89e6\u899a\u30bb\u30f3\u30b5\u300d\u3092\u57f7\u7b46<br \/>\n[9]\u00a0<span class=\"style10\">\u62db\u5f85\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5fae\u5c0f\u632f\u52d5\u523a\u6fc0\u306e\u932f\u899a\u3092\u7528\u3044\u305f\u89e6\u899a\u5448\u793a\u30b7\u30b9\u30c6\u30e0\u3068\u305d\u306e\u5c55\u671b\uff5e\u958b\u767a\u7af6\u4e89\u6fc0\u3057\u3044\u660e\u65e5\u306e\u30bf\u30c3\u30c1\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u6280\u8853\uff5e\u300d,<br \/>\n(\u8ca1)\u95a2\u897f\u60c5\u5831\uff65\u7523\u696d\u6d3b\u6027\u5316\u30bb\u30f3\u30bf\u30fc\u300c\u60c5\u5831\u5bb6\u96fb\u30d3\u30b8\u30cd\u30b9\u30b7\u30fc\u30ba\u7814\u7a76\u4f1a\u300d, 2008\u5e747\u670811\u65e5<br \/>\n[10]\u00a0<span class=\"style10\">Invited Talk<\/span>, Hideyuki Sawada: \u201cHaptic Device for Contact Exploration\u201d,<br \/>\nin Japan \u2013 U.S. 
Joint Research Program \u201cContact Interface Modeling and Stiffness-based Biomedical Diagnosis with Sensing Technology Towards a Better Quality of Life\u201d, Saturday, Jun 14th, 2008<br \/>\n[11]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u77e5\u7684\u60c5\u5831\u51e6\u7406\u30fb\u77e5\u80fd\u30ed\u30dc\u30c6\u30a3\u30af\u30b9\u306e\u000b\u73fe\u72b6\u3068\u5c06\u6765\u5c55\u671b\u300d,<br \/>\n<a href=\"http:\/\/www.ssken.co.jp\/\">\u56db\u56fd\u7dcf\u5408\u7814\u7a76\u6240<\/a>\u00a0\u7dcf\u7814\u30bb\u30df\u30ca\u30fc, 2008\u5e745\u67089\u65e5(\u571f)<br \/>\n[12]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u8996\u899a\u30fb\u8074\u899a\u30fb\u89e6\u899a\u306b\u3088\u308b\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u3068\u7d71\u5408\u30e1\u30c7\u30a3\u30a2\u6280\u8853\u300d,<br \/>\n\u5927\u962a\u96fb\u6c17\u901a\u4fe1\u5927\u5b66 \u8996\u899a\u60c5\u5831\u57fa\u790e\u7814\u7a76\u65bd\u8a2d(VIRI)\u5b66\u8853\u8b1b\u6f14\u4f1a \uff0d\u8996\u899a\u304b\u3089\u4e94\u611f\u3078\u3068\u5e83\u304c\u308b\u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f3\u6280\u8853\uff0d, 2008\u5e7412\u67085\u65e5(\u91d1)<br \/>\n[13]\u00a0<span class=\"style10\">\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u30ed\u30dc\u30c3\u30c8\u5de5\u5b66\u3068\u6b21\u4e16\u4ee3\u30e1\u30c7\u30a3\u30a2\u7d71\u5408\u6280\u8853\u306e\u5c55\u671b \uff5e\u30cb\u30c3\u30dd\u30f3\u306e\u79d1\u5b66\u30fb\u6280\u8853\u3092\u62c5\u3046\u6b21\u306e\u82e5\u8005\u305f\u3061\u3078\uff5e\u300d,<br \/>\n<a href=\"http:\/\/www.qsj1984.com\/\">\u5b66\u7fd2\u587e\u30af\u30bb\u30b8\u30e5<\/a>\u8b1b\u6f14\u4f1a, 2009\u5e742\u670814\u65e5(\u571f)<br \/>\n[14]\u00a0<span class=\"style10\">\u65e5\u672c\u6a5f\u68b0\u5b66\u4f1a \u8b1b\u7fd2\u4f1a<\/span>, 
\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u7cf8\u306e\u5fae\u5c0f\u632f\u52d5\u5b50\u30a2\u30ec\u30a4\u306b\u3088\u308b\u89e6\u899a\u611f\u899a\u5448\u793a\u300d,<br \/>\n\u6a5f\u7d20\u6f64\u6ed1\u8a2d\u8a08\u90e8\u9580\u300c\u89e6\u899a\u6280\u8853\u306e\u57fa\u790e\u3068\u5fdc\u7528 \uff5e\u30d2\u30c8\u306e\u89e6\u899a\u7406\u89e3\u304b\u3089\u30d2\u30e5\u30fc\u30de\u30f3\u30de\u30b7\u30f3\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u3084\u30ed\u30dc\u30c3\u30c8\u3078\u306e\u5fdc\u7528\u307e\u3067\uff5e\u300d, 2008\u5e747\u670818\u65e5(\u91d1)<br \/>\n[15] Hideyuki Sawada: \u201cActive Tracking of Particular Person Using Visual and Auditory Information\u201d,<br \/>\n<a href=\"http:\/\/www.edurejs2008.kagawa-u.ac.jp\/\">2nd KU-CMU Joint Symposium<\/a>, p.21, 2008<br \/>\n[16] \u798f\u5c71\u60e0\u58eb\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u7cf8\u30a2\u30af\u30c1\u30e5\u30a8\u30fc\u30bf\u306b\u3088\u308a\u751f\u8d77\u3055\u308c\u308b\u5fae\u5c0f\u632f\u52d5\u306b\u3088\u308b\u30c6\u30af\u30b9\u30c1\u30e3\u611f\u899a\u5448\u793a\u300d,<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c9\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2008), pp.961-962, 2008<br \/>\n[17] \u6771\u85e4\u7be4\u53f2\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u30ed\u30dc\u30c3\u30c8\u30a2\u30fc\u30e0\u3092\u7528\u3044\u305f\u97f3\u5834\u306e\u80fd\u52d5\u30bb\u30f3\u30b7\u30f3\u30b0\u300d,<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c9\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2008), pp.657-658, 2008<br \/>\n[18] Hirofumi Takase and Hideyuki Sawada, \u201cImage-based Gesture Recognition for Intuitive User Interface\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.228, 
2008\u5e749\u670827\u65e5(\u571f)<br \/>\n[19] \u8d99\u92d2\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u306e\u632f\u52d5\u3092\u5229\u7528\u3057\u305f\u70b9\u5b57\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u300d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.323, 2008\u5e749\u670827\u65e5(\u571f)<br \/>\n[20]\u00a0<span class=\"style8\">IEEE Best Presentation Award<\/span>, Mitsuki Kitani and Hideyuki Sawada: \u201cProposal of 3D SOM for Autonomous Voice Acquisition of a Talking Robot\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.370, 2008\u5e749\u670827\u65e5(\u571f)<br \/>\n[21] Keishi Fukuyama and Hideyuki Sawada: \u201cMicro-vibration Actuators and the Presentation of Tactile Sensation to Human Skin\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.397, 2008\u5e749\u670827\u65e5(\u571f)<br \/>\n[22] Mustafa Mahfuzah and Hideyuki Sawada: \u201cHuman Motion Detection using Optical Flow\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.398, 2008\u5e749\u670827\u65e5(\u571f)<br \/>\n[23] Atsushi Todo and Hideyuki Sawada: \u201cActive sensing of sound location using a robotic arm\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.399, 2008\u5e749\u670827\u65e5(\u571f)<br \/>\n[24] Naoki Takahashi and Hideyuki Sawada: \u201cDevelopment of a tactile display and the presentation of alphabetic characters\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6, p.400, 2008\u5e749\u670827\u65e5(\u571f)<br \/>\n<\/span><\/p>\n<p class=\"style3\"><span 
class=\"style4\"><strong>2007\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span><span class=\"style6\">\u3000 [1]\u00a0<span class=\"style8\">New Technology Foundation Award Finalist<\/span>, Hideyuki Sawada and Mitsuhiro Nakamura: \u201cMechanical Voice System and its Singing Performance\u201d,<br \/>\nInternational Conference on Intelligent Robots and Systems (IROS2007), November 2nd, 2007<br \/>\n[2]\u00a0<span class=\"style8\">\u30d2\u30e5\u30fc\u30de\u30f3\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u8cde \u53d7\u8cde \uff08\u96fb\u5b50\u60c5\u5831\u901a\u4fe1\u5b66\u4f1a \u30d2\u30e5\u30fc\u30de\u30f3\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u30b0\u30eb\u30fc\u30d7\uff09<\/span>\u3001<br \/>\n\u8ad6\u6587 \u300c\u767a\u8a71\u30ed\u30dc\u30c3\u30c8\u3092\u7528\u3044\u305f\u8074\u899a\u969c\u788d\u8005\u306e\u305f\u3081\u306e\u767a\u8a71\u8a13\u7df4\u300d, \u4fe1\u5b66\u6280\u5831 HCS2007-24, pp. 125-130, 2007\u5e743\u670823\u65e5<br \/>\n[3] \u6c34\u4e0a\u967d\u4ecb\u3001\u5185\u7530\u5553\u6cbb\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u7cf8\u72b6\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u306e\u632f\u52d5\u3092\u5229\u7528\u3057\u305f\u9ad8\u6b21\u77e5\u899a\u751f\u8d77\u306b\u3088\u308b\u89e6\u899a\u5448\u793a\u300d,<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a\u8ad6\u6587\u8a8c, Vol.48, No.12, pp. 
3739-3749, 2007<br \/>\n[4] Hideyuki Sawada and Toshiya Takechi, \u201cA Robotic Auditory System that Interacts with Musical Sounds and Human Voices\u201d,<br \/>\nJournal of Advanced Computational Intelligence and Intelligent Informatics, Vol.11, No.10, pp.1177-1183, 2007<br \/>\n[5] Hideyuki Sawada and Minoru Ohkado, \u201cSensing of Particular Speakers for the Construction of Voice Interface Utilized in Noisy Environment\u201d,<br \/>\nJournal of Electrical Engineering in Japan, Vol.162, No.3, pp.78-86, February 2008<br \/>\n[6] Hideyuki Sawada, \u201cTalking Robot and the Autonomous Acquisition of Vocalization and Singing Skill\u201d,<br \/>\nChapter 22\u00a0<em>in<\/em>\u00a0<strong>Robust Speech Recognition and Understanding<\/strong>, Edited by Grimm and Kroschel, pp.385-404, June 2007,\u00a0<strong>ISBN<\/strong>: 978-3-902613-08-0<br \/>\n[7]\u00a0<span class=\"style10\">Invited Lecture<\/span>, Hideyuki Sawada, \u201cTactile devices and the tactile communication\u201d,<br \/>\nInternational Workshop on Tactile and Haptic Interaction, pp.8-19, May 28, 2007<br \/>\n[8]\u00a0<span class=\"style10\">\u62db\u5f85\u8b1b\u6f14<\/span>, \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u767a\u8a71\u30ed\u30dc\u30c3\u30c8\u306b\u3088\u308b\u767a\u8a71\u52d5\u4f5c\u306e\u518d\u73fe\u3068\u80a2\u4f53\u4e0d\u81ea\u7531\u5150\u306e\u767a\u8a71\u8a13\u7df4\u3078\u306e\u5fdc\u7528\u300d,<br \/>\n\u7b2c52\u56de\u897f\u65e5\u672c\u80a2\u4f53\u4e0d\u81ea\u7531\u5150\u65bd\u8a2d\u904b\u55b6\u7814\u7a76\u4f1a, 2007\u5e749\u670814\u65e5<br \/>\n[9]\u00a0<span class=\"style10\">Invited Lecture<\/span>, Hideyuki Sawada, \u201cEvolution of Talking Robot\u201d,<br \/>\nUniversiti Teknologi Malaysia, Malaysia, November 19, 2007<br \/>\n[10] Khairunizam Wan, Atsushi Todo, Eric Benoit and Hideyuki Sawada, \u201cAn Information Fusion System of Sensors for Human-machine Communication\u201d,<br \/>\nAnnual Conference of the IEEE Industrial Electronics Society (IECON2007), pp.239-244, 2007<br 
\/>\n[11] Hideyuki Sawada, Mitsuki Kitani and Yasumori Hayashi, \u201cArticulatory Reproduction of Voices of Hearing-impaired by a Talking Robot\u201d,<br \/>\nFrontiers in the Convergence of Bioscience and Information Technologies (FBIT2007), pp.603-608, 2007<br \/>\n[12] Khairunizam Wan and Hideyuki Sawada, \u201c3D Measurement of human upper body for gesture recognition\u201d,<br \/>\nInternational Symposium on Optomechatronic Technologies (ISOT2007), Vol. 6718, 67180I-1\uff5e8, 2007<br \/>\n[13] Khairunizam Wan and Hideyuki Sawada, \u201cOptical Tracking for the Estimation of Human Upper Body Model and Its 3D Motion\u201d,<br \/>\n13th Microoptics Conference (MOC\u201907), pp. 284-285, 2007<br \/>\n[14] Takashi Anezaki, Seiji Hata and Hideyuki Sawada, \u201cInteractive User Interface for Visual Inspection System\u201d,<br \/>\nAnnual Conference of the IEEE Industrial Electronics Society (IECON2007), pp.222-227, 2007<br \/>\n[15] Khairunizam Wan and Hideyuki Sawada, \u201c3D Motion prediction of human upper body by tracking reflective markers on a moving body\u201d,<br \/>\nMalaysia Japan International Symposium on Advanced Technology (MJISAT2007), MJISAT-172, 2007<br \/>\n[16] Hideyuki Sawada, Yasumori Hayashi and Mitsuki Kitani, \u201cA Robotic Mechanical Voice Simulator to Train Auditory Impaired People\u201d,<br \/>\nMalaysia Japan International Symposium on Advanced Technology (MJISAT2007), MJISAT-319_SS, 2007<br \/>\n[17] Mitsuki Kitani, Yasumori Hayashi and Hideyuki Sawada, \u201cA Talking Robot and the Reproduction of Human Voice\u201d,<br \/>\n14th Tri-University International Joint Seminar and Symposium, pp. 347-352, 2007<br \/>\n[18] Atsushi Todo and Hideyuki Sawada, \u201cEstimation of musical pitch by using comb filters for the identification of musical instruments\u201d,<br \/>\nSICE Annual Conference 2007, pp. 
660-663, 2007<br \/>\n[19] Yohsuke Mizukami, Keiji Uchida and Hideyuki Sawada, \u201cTransmission of Stroking Sensation on a Skin by Higher-psychological Perception\u201d,<br \/>\nSICE Annual Conference 2007, pp. 1873-1876, 2007<br \/>\n[20] Hideyuki Sawada, \u201cHuman Interface Studies for the Communication between Human and Systems\u201d,<br \/>\nEducational Research Exchange Joint Symposium (EDUREJS), p. 9, 2007<br \/>\n[21] Keishi Fukuyama, Yohsuke Mizukami and Hideyuki Sawada, \u201cPresentation of Tactile Sensation Using the Micro Vibration of Shape Memory Alloy\u201d,<br \/>\nEducational Research Exchange Joint Symposium (EDUREJS), p. 61, 2007<br \/>\n[22] \u6c34\u4e0a\u967d\u4ecb\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5fae\u5c11\u632f\u52d5\u5b50\u3092\u7528\u3044\u305f\u89e6\u899a\u30c7\u30a3\u30b9\u30d7\u30ec\u30a4\u3068\u99c6\u52d5\u4fe1\u53f7\u306e\u767a\u751f\u78ba\u7387\u5bc6\u5ea6\u5236\u5fa1\u306b\u3088\u308b\u89e6\u899a\u611f\u899a\u306e\u5448\u793a\u300d\u3001<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a \u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f32008 \u8ad6\u6587\u96c6, pp. 
195-202, 2008<br \/>\n[23]\u00a0<span class=\"style8\">IEEE Best Presentation Award<\/span>, Yasumori Hayashi and Hideyuki Sawada, \u201cAnalysis of Acquired Vocal Sounds by a Talking Robot\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.335, 2007\u5e749\u670829\u65e5<br \/>\n[24] Yohsuke Mizukami, Keiji Uchida, Shinya Makino and Hideyuki Sawada, \u201cPresentation of tactile sensations using the vibration of SMA thread\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.332, 2007<br \/>\n[25] Takuma Ise and Hideyuki Sawada, \u201cExtraction of voice characteristics for speech training of auditory impaired people\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.343, 2007<br \/>\n[26] Atsushi Todo, Khairunizam Wan and Hideyuki Sawada, \u201cSensor Fusion Technique for Human Tracking\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.344, 2007<br \/>\n[27] \u6fa4\u7530\u79c0\u4e4b\u3001\u6c34\u4e0a\u967d\u4ecb\u3001\u798f\u5c71\u60e0\u58eb\u3001\u5185\u7530\u5553\u6cbb\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u7cf8\u306b\u3088\u308b\u5fae\u5c11\u632f\u52d5\u5b50\u30a2\u30ec\u30a4\u3068\u89e6\u899a\u5448\u793a\u300d,<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c8\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2007)\u3001pp. 
35-36, 2007<br \/>\n[28] \u6fa4\u7530\u79c0\u4e4b\u3001\u6c34\u4e0a\u967d\u4ecb\u3001\u798f\u5c71\u60e0\u58eb\u3001\u5185\u7530\u5553\u6cbb\u3001\u91d1\u5b50\u771f\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u7cf8\u3092\u5229\u7528\u3057\u305f\u5fae\u5c11\u632f\u52d5\u30a2\u30af\u30c1\u30e5\u30a8\u30fc\u30bf\u3068\u305d\u306e\u7279\u6027\u89e3\u6790\u300d,<br \/>\n\u8a08\u6e2c\u81ea\u52d5\u5236\u5fa1\u5b66\u4f1a \u7b2c8\u56de\u30b7\u30b9\u30c6\u30e0\u30a4\u30f3\u30c6\u30b0\u30ec\u30fc\u30b7\u30e7\u30f3\u90e8\u9580\u8b1b\u6f14\u4f1a(SI2007)\u3001pp. 67-68, 2007<br \/>\n[29] \u6fa4\u7530\u79c0\u4e4b\u3001\u4e2d\u6751\u5149\u5b8f\u3001\u6797\u606d\u5b88\u3001\u6728\u8c37\u5149\u6765\uff1a\u300c\u767a\u8a71\u30ed\u30dc\u30c3\u30c8\u3092\u7528\u3044\u305f\u8074\u899a\u969c\u788d\u8005\u306e\u305f\u3081\u306e\u767a\u8a71\u8a13\u7df4\u300d,<br \/>\n\u96fb\u5b50\u60c5\u5831\u901a\u4fe1\u5b66\u4f1a \u30d2\u30e5\u30fc\u30de\u30f3\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u57fa\u790e\uff08HCS\uff09\u7814\u7a76\u4f1a HCS2007-24, pp. 125-130, 2007<br \/>\n<\/span><\/p>\n<p class=\"style3\"><span class=\"style4\"><strong>2006\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span><span class=\"style6\">\u3000 [1]\u00a0<span class=\"style8\">\u30d2\u30e5\u30fc\u30de\u30f3\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u8cde \u53d7\u8cde\uff08\u96fb\u5b50\u60c5\u5831\u901a\u4fe1\u5b66\u4f1a \u30d2\u30e5\u30fc\u30de\u30f3\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u30b0\u30eb\u30fc\u30d7\uff09<\/span>\u3001<br \/>\n\u8ad6\u6587 \u300c\u89e6\u899a\u5448\u793a\u30c7\u30d0\u30a4\u30b9\u3092\u7528\u3044\u305f\u306a\u305e\u308a\u611f\u899a\u306e\u5448\u793a\u300d\u3001\u4fe1\u5b66\u6280\u5831 HCS2006-13\u3001Vol.106, No.83, pp. 
67-72, 2007\u5e743\u670824\u65e5<br \/>\n[2]\u00a0<span class=\"style8\">\u7814\u7a76\u4f1a\u8cde \u53d7\u8cde \uff08\u30d2\u30e5\u30fc\u30de\u30f3\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u5b66\u4f1a\uff09<\/span>\u3001\u8ad6\u6587 \u300c\u89e6\u899a\u5448\u793a\u30c7\u30d0\u30a4\u30b9\u3092\u7528\u3044\u305f\u306a\u305e\u308a\u611f\u899a\u306e\u5448\u793a\u300d\u3001<br \/>\n\u30d2\u30e5\u30fc\u30de\u30f3\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u5b66\u4f1a \u7814\u7a76\u4f1a\u8ad6\u6587\u96c6\u3001Vol.8, No.2, pp. 67-72, 2007\u5e743\u67082\u65e5<br \/>\n[3]\u00a0<span class=\"style8\">\u30a4\u30f3\u30bf\u30e9\u30af\u30c6\u30a3\u30d6\u767a\u8868\u8cde \u53d7\u8cde \uff08\u60c5\u5831\u51e6\u7406\u5b66\u4f1a \u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f32007\uff09<\/span>\u3001<br \/>\n\u300c\u8584\u578b\u89e6\u899a\u5448\u793a\u30c7\u30d0\u30a4\u30b9\u306b\u3088\u308b\u9ad8\u6b21\u77e5\u899a\u3092\u5229\u7528\u3057\u305f\u89e6\u899a\u60c5\u5831\u5448\u793a\u300d\u3001\u60c5\u5831\u51e6\u7406\u5b66\u4f1a \u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f3\u8ad6\u6587\u96c6, pp. 
121-128, 2007\u5e743\u670816\u65e5<br \/>\n[4] \u6fa4\u7530\u79c0\u4e4b\u3001\u5927\u52a0\u6238\u7a14\uff1a\u300c\u96d1\u97f3\u74b0\u5883\u4e0b\u306b\u304a\u3051\u308b\u97f3\u58f0\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u69cb\u7bc9\u306e\u305f\u3081\u306e\u7279\u5b9a\u8a71\u8005\u306e\u30bb\u30f3\u30b7\u30f3\u30b0\u300d,<br \/>\n\u96fb\u6c17\u5b66\u4f1a\u8ad6\u6587\u8a8cD, Vol.126, No.11, pp.1446-1453, 2006<br \/>\n[5] Yosuke Mizukami and Hideyuki Sawada, \u201cTactile information transmission by apparent movement phenomenon using shape-memory alloy device\u201d,<br \/>\nInternational Journal on Disability and Human Development, Vol.5, No.3, pp.277-284, 2006<br \/>\n[6] Didier Coquin, Eric Benoit, Hideyuki Sawada and Bogdan Ionescu, \u201cGestures Recognition Based on the Fusion of Hand Positioning and Arm Gestures\u201d,<br \/>\nJournal of Robotics and Mechatronics, Vol.18, No.6, pp.751-759, 2006<br \/>\n[7] Hideyuki Sawada, Norio Takeuchi and Akihiro Hisada, \u201cA realtime clarification filter of dysphonic speech and its evaluation by listening experiments\u201d,<br \/>\nInternational Journal on Disability and Human Development, Vol.4, No.3, pp.183-189, 2005<br \/>\n[8] Yosuke Mizukami and Hideyuki Sawada, \u201cTactile Information Transmission by Apparent Movement Phenomenon Using Shape-memory Alloy Device\u201d,<br \/>\nInternational Conference on Disability, Virtual Reality and Associated Technologies, pp. 133-140, 2006<br \/>\n[9] Mitsuhiro Nakamura and Hideyuki Sawada, \u201cTalking Robot and the Analysis of Autonomous Voice Acquisition\u201d,<br \/>\nIEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS2006), pp. 
4684-4689, 2006<br \/>\n[10] Hideyuki Sawada, Atsushi Todo and Toshiya Takechi, \u201cRealtime Speaker Tracking for Robotic Auditory System\u201d,<br \/>\nAnnual Conference of the IEEE Industrial Electronics Society (IECON2006), pp.5474-5479, 2006<br \/>\n[11] \u6c34\u4e0a\u967d\u4ecb\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u8584\u578b\u89e6\u899a\u5448\u793a\u30c7\u30d0\u30a4\u30b9\u306b\u3088\u308b\u9ad8\u6b21\u77e5\u899a\u3092\u5229\u7528\u3057\u305f\u89e6\u899a\u60c5\u5831\u5448\u793a\u300d\u3001<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a \u30a4\u30f3\u30bf\u30e9\u30af\u30b7\u30e7\u30f32007 \u8ad6\u6587\u96c6, pp. 121-128, 2007<br \/>\n[12] \u6c34\u4e0a\u967d\u4ecb\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u89e6\u899a\u5448\u793a\u30c7\u30d0\u30a4\u30b9\u3092\u7528\u3044\u305f\u306a\u305e\u308a\u611f\u899a\u306e\u5448\u793a\u300d\u3001<br \/>\n\u96fb\u5b50\u60c5\u5831\u901a\u4fe1\u5b66\u4f1a \u6280\u8853\u7814\u7a76\u5831\u544a HCS2006-13\u3001Vol.106, No.83, pp. 67-72, 2006<br \/>\n[13]\u00a0<span class=\"style8\">\u30d9\u30b9\u30c8\u30d7\u30ec\u30bc\u30f3\u30c6\u30fc\u30b7\u30e7\u30f3\u8cde \u53d7\u8cde<\/span>\u3001\u4e2d\u6751\u5149\u5b8f\u3001\u6797\u606d\u5b88\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u767a\u8a71\u30ed\u30dc\u30c3\u30c8\u306e\u767a\u8a71\u52d5\u4f5c\u7372\u5f97\u3068\u6b4c\u58f0\u306e\u751f\u6210\u300d\u3001<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a \u97f3\u697d\u60c5\u5831\u79d1\u5b66\u7814\u7a76\u4f1a \u590f\u306e\u30b7\u30f3\u30dd\u30b8\u30a6\u30e0\u3001\u60c5\u5831\u51e6\u7406\u5b66\u4f1a\u7814\u7a76\u5831\u544a 2006-MUS-66, pp. 
135-140, 2006<br \/>\n[14] \u9ad8\u7530\u6210\u5fb3\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u30b8\u30a7\u30b9\u30c1\u30e3\u304b\u3089\u306e\u30a4\u30f3\u30c6\u30f3\u30b7\u30e7\u30f3\u62bd\u51fa\u306b\u95a2\u3059\u308b\u4e00\u8003\u5bdf\u300d\u3001<br \/>\n\u96fb\u5b50\u60c5\u5831\u901a\u4fe1\u5b66\u4f1a \u30d2\u30e5\u30fc\u30de\u30f3\u30b3\u30df\u30e5\u30cb\u30b1\u30fc\u30b7\u30e7\u30f3\u57fa\u790e\uff08HCS\uff09\u7814\u7a76\u4f1a HCS2006-8, pp. 37-42, 2006<br \/>\n[15] \u6fa4\u7530\u79c0\u4e4b\u3001\u6c34\u4e0a\u967d\u4ecb\uff1a\u300c\u89e6\u899a\u306e\u9ad8\u6b21\u77e5\u899a\u3092\u5229\u7528\u3057\u305f\u632f\u52d5\u523a\u6fc0\u306b\u3088\u308b\u89e6\u899a\u5448\u793a\u300d\u3001<br \/>\n\u96fb\u6c17\u5b66\u4f1a \u7523\u696d\u30b7\u30b9\u30c6\u30e0\u60c5\u5831\u5316\u7814\u7a76\u4f1a IIS-06-49, pp.23-28, 2006<br \/>\n[16]\u00a0<span class=\"style8\">IEEE Presentation Award<\/span>, Wan Khairunizam and Hideyuki Sawada, \u201cModel-Based Estimation for 3D Motion of Human Upper Body\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.340, 2006\u5e749\u670826\u65e5<br \/>\n[17]\u00a0<span class=\"style8\">IEEE Presentation Award<\/span>, Shigenori Takata and Hideyuki Sawada, \u201cExtraction of Intention from Gestures based on the trajectory resampling algorithm\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.341, 2006\u5e749\u670826\u65e5<br \/>\n[18] Atsushi Todo and Hideyuki Sawada, \u201cIdentification of musical instrument by a robotic auditory system\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.342, 2006<br \/>\n[19]\u00a0<span class=\"style8\">IEEE Presentation Award<\/span>, Mitsuhiro Nakamura and Hideyuki Sawada, \u201cVoice Acquisition of a Talking Robot Using a Self-organizing Map\u201d,<br 
\/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.343, 2006\u5e749\u670826\u65e5<br \/>\n[20] Takuma Ise and Hideyuki Sawada, \u201cAcoustic analysis of esophageal speech\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.344, 2006<br \/>\n<\/span><\/p>\n<p class=\"style3\"><span class=\"style4\"><strong>2005\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/strong><br \/>\n<\/span><span class=\"style6\">\u3000 [1] Shuji Hashimoto and Hideyuki Sawada, \u201cA Grasping Device to Sense Hand Gesture for Expressive Sound Generation\u201d,<br \/>\nJournal of New Music Research, Vol.34, No.1, pp.115-123, 2005<br \/>\n[2] Hideyuki Sawada and Toshiya Takechi, \u201cA Robotic Auditory System which Interacts with Musical Sounds\u201d,<br \/>\nInternational Conference on Intelligent Technologies, pp.362-367, 2005<br \/>\n[3] Hideyuki Sawada and Mitsuhiro Nakamura, \u201cA Talking Robot and Its Singing Skill Acquisition\u201d,<br \/>\nInternational Conference on Knowledge-Based Intelligent Information and Engineering Systems, pp.898-907, 2005<br \/>\n[4] Didier Coquin, Eric Benoit, Hideyuki Sawada and Bogdan Ionescu, \u201cFusion of Hand and Arm Gestures\u201d,<br \/>\nSPIE Optomechatronic Machine Vision, Vol. 
6051, 6051-14, 2005<br \/>\n[5] Hideyuki Sawada, Shigenori Takata and Eric Benoit, \u201cExtraction of Individuality in Emotional Gestures Based on Acceleration\u201d,<br \/>\nInternational Workshop on Research and Education in Mechatronics, pp.75-80, 2005<br \/>\n[6] Eric Benoit, Didier Coquin and Hideyuki Sawada, \u201cDistributed data fusion applied to human gesture measurement\u201d,<br \/>\nInternational Workshop on Research and Education in Mechatronics, pp.92-96, 2005<br \/>\n[7] Hideyuki Sawada and Minoru Ohkado, \u201cRealtime Speaker Identification Technique in Noisy Environment\u201d,<br \/>\nInternational Workshop on Research and Education in Mechatronics, pp.124-129, 2005<br \/>\n[8] Hideyuki Sawada and Mitsuhiro Nakamura, \u201cA Talking Robot which Mimics Human Vocalization\u201d,<br \/>\nInternational Workshop on Research and Education in Mechatronics, pp.130-135, 2005<br \/>\n[9] \u6c34\u4e0a\u967d\u4ecb\u3001\u5185\u7530\u5553\u6cbb\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u5f62\u72b6\u8a18\u61b6\u5408\u91d1\u3092\u7528\u3044\u305f\u89e6\u899a\u63d0\u793a\u30c7\u30d0\u30a4\u30b9\u306b\u3088\u308b\u30d5\u30a1\u30f3\u30c8\u30e0\u30bb\u30f3\u30bb\u30fc\u30b7\u30e7\u30f3\u3068\u4eee\u73fe\u904b\u52d5\u306e\u63d0\u793a\u300d\u3001<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.346, 2005<br \/>\n[10] Shigenori Takata, Eric Benoit and Hideyuki Sawada, \u201cExtraction of Intention and Individuality in Gestures based on Velocity and Acceleration\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.353, 2005<br \/>\n[11] Toshiya Takechi, Yasuyuki Ishihara, Takashi Mandono and Hideyuki Sawada, \u201cAutomobile identification in noisy environment\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.376, 2005<br \/>\n[12] Tsuyoshi Yoshizaki 
and Hideyuki Sawada, \u201cClarification of Esophageal Speech with the Preservation of Tonal Features\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a \u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.384, 2005<br \/>\n<\/span><\/p>\n<p class=\"style3\"><strong><span class=\"style4\">2004\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/span><\/strong><br \/>\n<span class=\"style6\">\u3000 [1]\u00a0<span class=\"style8\">Hyper Human Tech Award \u53d7\u8cde<\/span><br \/>\nHideyuki Sawada and Mitsuhiro Nakamura: \u201cMechanical Voice System and its Singing Performance\u201d,<br \/>\nIEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS2004), pp. 1920-1925, 2004\u5e7410\u67081\u65e5<br \/>\n[2] \u6fa4\u7530\u79c0\u4e4b\uff1a \u300c\u4eba\u5de5\u97f3\u58f0\uff5e\u58f0\u9053\u7269\u7406\u30e2\u30c7\u30eb\u306e\u8074\u899a\u30d5\u30a3\u30fc\u30c9\u30d0\u30c3\u30af\u5236\u5fa1\u306b\u57fa\u3065\u304f\u97f3\u58f0\u751f\u6210\u300d,<br \/>\n\u65e5\u672c\u6a5f\u68b0\u5b66\u4f1a\u8a8c, Vol.107, No.1033, pp.33-35, 2004<br \/>\n[3] Hideyuki Sawada, Takumi Ukegawa and Eric Benoit: \u201cRobust Gesture Recognition by Possibilistic Approach based on Data Resampling\u201d,<br \/>\nInternational Workshop on Fuzzy Systems &amp; Innovational Computing (FIC2004), pp. 168-173, 2004<br \/>\n[4] Hideyuki Sawada and Mitsuhiro Nakamura: \u201cA Talking and Singing Robot based on the Mechanical Construction of Human Vocal System\u201d,<br \/>\nInternational Conference on Intelligent Mechatronics and Automation, pp. 89-94, 2004<br \/>\n[5] Hideyuki Sawada, Norio Takeuchi and Akihiro Hisada: \u201cA Real-time Clarification Filter of a Dysphonic Speech and Its Evaluation by Listening Experiments\u201d,<br \/>\nInternational Conference on Disability, Virtual Reality and Associated Technologies (ICDVRAT2004), pp. 
239-246, 2004<br \/>\n[6] Hideyuki Sawada and Minoru Ohkado: \u201cIdentification and tracking of particular speaker in noisy environment\u201d,<br \/>\nInternational Conference on Machine Vision and its Optomechatronic Applications, OpticsEast, SPIE International Society for Optical Engineering, pp. 138-145, 2004<br \/>\n[7] Toshiya Takechi, Koichi Sugimoto, Takashi Mandono and Hideyuki Sawada: \u201cAutomobile identification based on the measurement of car sounds\u201d,<br \/>\nAnnual Conference of the IEEE Industrial Electronics Society, TD6-4, 2004<br \/>\n[8] Minoru Ohkado and Hideyuki Sawada: \u201cDetection and Identification of Particular Speaker in Noisy Environment\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.327, 2004<br \/>\n[9]\u00a0<span class=\"style8\">IEEE Best Presentation Prize \u53d7\u8cde<\/span><br \/>\nToshiya Takechi and Hideyuki Sawada: \u201cEstimation of car location based on the measurement of automobile sounds\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.328, 2004<br \/>\n[10] Tsuyoshi Yoshizaki and Hideyuki Sawada: \u201cRealtime voice transformation with the preservation of speaker\u2019s individuality\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.329, 2004<br \/>\n[11] Mitsuhiro Nakamura and Hideyuki Sawada: \u201cSinging Performance by a Mechanical Voice System\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.330, 2004<br \/>\n[12] Hidefumi Moritani and Hideyuki Sawada: \u201cIdentification of a particular person by facial parts extraction\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.331, 2004<br \/>\n[13] 
Shigenori Takata and Hideyuki Sawada: \u201cSize and Speed-independent Gesture Recognition based on Possibilistic Approach\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.332, 2004<br \/>\n[14] \u6fa4\u7530\u79c0\u4e4b\u3001\u7af9\u5185\u7d00\u592b\u3001\u4e45\u7530\u7ae0\u5f18\uff1a \u300c\u98df\u9053\u767a\u58f0\u30fb\u8133\u6027\u9ebb\u75fa\u60a3\u8005\u97f3\u58f0\u306e\u660e\u77ad\u5316\u30d5\u30a3\u30eb\u30bf\u30ea\u30f3\u30b0\u3068DSP\u3078\u306e\u5b9f\u88c5\u300d\u3001<br \/>\n\u96fb\u5b50\u60c5\u5831\u901a\u4fe1\u5b66\u4f1a \u6280\u8853\u7814\u7a76\u5831\u544a HCS2004-4, pp. 17-22, 2004<br \/>\n[15] \u68ee\u8c37\u79c0\u6587\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a \u300c\u9854\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u69cb\u7bc9\u306e\u305f\u3081\u306e\u8996\u7dda\u30fb\u9854\u90e8\u54c1\u8a8d\u8b58\u300d\u3001<br \/>\n\u96fb\u5b50\u60c5\u5831\u901a\u4fe1\u5b66\u4f1a \u6280\u8853\u7814\u7a76\u5831\u544a HCS2004-5, pp. 23-28, 2004<br \/>\n[16] \u6fa4\u7530\u79c0\u4e4b\u3001\u4e2d\u6751\u5149\u5b8f\uff1a \u300c\u8074\u899a\u30d5\u30a3\u30fc\u30c9\u30d0\u30c3\u30af\u306b\u3088\u308b\u767a\u8a71\u30ed\u30dc\u30c3\u30c8\u306e\u58f0\u771f\u4f3c\u52d5\u4f5c\u306e\u751f\u6210\u300d\u3001<br \/>\n\u96fb\u5b50\u60c5\u5831\u901a\u4fe1\u5b66\u4f1a \u6280\u8853\u7814\u7a76\u5831\u544a HCS2004-25, pp. 
19-24, 2004<br \/>\n<\/span><\/p>\n<p class=\"style3\"><strong><span class=\"style4\">2003\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/span><\/strong><br \/>\n<span class=\"style6\">\u3000 [1] \u4e45\u7530\u7ae0\u5f18\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a \u300c\u30b9\u30da\u30af\u30c8\u30eb\u5305\u7d61\u3068\u8abf\u6ce2\u69cb\u9020\u306e\u5f37\u8abf\u51e6\u7406\u306b\u3088\u308b\u98df\u9053\u767a\u58f0\u97f3\u58f0\u306e\u5b9f\u6642\u9593\u660e\u77ad\u5316\u300d\u3001<br \/>\n\u30d2\u30e5\u30fc\u30de\u30f3\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u5b66\u4f1a\u8a8c, Vol.5, No.4, pp.447-454, 2003<br \/>\n[2] \u8f3f\u6c34\u5927\u548c\u3001\u8c9d\u539f\u4fca\u4e5f\u3001\u6fa4\u7530\u79c0\u4e4b\uff1a \u300c\u6b21\u4e16\u4ee3\u578b\u751f\u7523\u30fb\u6d41\u901a\u30b7\u30b9\u30c6\u30e0\u306e\u69cb\u7bc9\u306b\u5411\u3051\u3066\uff0d\u4eba\u9593\u4e2d\u5fc3\u306e\u611f\u6027\u751f\u7523\u30b7\u30b9\u30c6\u30e0\u3078\u306e\u4e00\u63d0\u6848\uff0d\u300d\u3001<br \/>\n\u96fb\u6c17\u5b66\u4f1a\u8ad6\u6587\u8a8cD, Vol.124, No.1, pp.1-7, 2004<br \/>\n[3] Antonio Camurri, Gualtiero Volpe (Eds.): \u201cGesture-Based Communication in Human-Computer Interaction\u201d,<br \/>\npp. 386-398, Springer-Verlag, 2004, ISBN 3-540-21072-5<br \/>\n[4] Hidefumi Moritani, Yuki Kawai and Hideyuki Sawada: \u201cIntuitive Manipulation of a Haptic Monitor for the Gestural Human-computer Interaction\u201d,<br \/>\nInternational Workshop on Gesture and Sign-Language based Human-Computer Interaction, pp.5-, 2003<br \/>\n[5] Toshio Higashimoto and Hideyuki Sawada: \u201cA Mechanical Voice System and its Adaptive Learning for the Mimicry of Human Vocalization\u201d,<br \/>\nIEEE International Symposium on Computational Intelligence in Robotics and Automation, pp. 
1040-1045, 2003<br \/>\n[6] Eric Benoit, Thomas Allevard, Takumi Ukegawa, Hideyuki Sawada: \u201cFuzzy Sensor for Gesture Recognition based on Motion and Shape Recognition of Hand\u201d,<br \/>\nIEEE International Symposium on Virtual Environments, Human-Computer Interfaces and Measurement Systems, pp.63-67, 2003<br \/>\n[7] Minoru Ohkado and Hideyuki Sawada: \u201cRealtime Detection and Identification of Plural Speakers Using a Microphone Array\u201d,<br \/>\nIEEE International Symposium on Virtual Environments, Human-Computer Interfaces and Measurement Systems, pp.151-156, 2003<br \/>\n[8] Akihiro Hisada and Hideyuki Sawada: \u201cRealtime Filtering for the Clarification of Esophageal Speech\u201d,<br \/>\nJapan-China Workshop on Multidisciplinary Researches in Engineering, pp.53-60, 2003<br \/>\n[9] Yuki Kawai, Hidefumi Moritani and Hideyuki Sawada: \u201cIntuitive communication system driven by haptic and gestural actions\u201d,<br \/>\nJapan-China Workshop on Multidisciplinary Researches in Engineering, pp.91-98, 2003<br \/>\n[10] Toshio Higashimoto and Hideyuki Sawada: \u201cA Mechanical Voice System: Construction of Vocal Cords and its Pitch Control\u201d,<br \/>\nInternational Conference on Intelligent Technologies, pp. 
762-768, 2003<br \/>\n[11]\u00a0<span class=\"style8\">Best Presentation Prize \u53d7\u8cde<\/span><br \/>\nAkihiro Hisada and Hideyuki Sawada: \u201cRealtime Clarification of Esophageal Speech\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.352, 2003<br \/>\n[12] Norio Takeuchi and Hideyuki Sawada: \u201cRealtime Clarification Filter for the Speech with Cerebral Palsy\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.353, 2003<br \/>\n[13] Toshiya Takechi and Hideyuki Sawada: \u201cEstimation of car location based on the measurement of automobile sounds\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.354, 2003<br \/>\n[14] Hidefumi Moritani and Hideyuki Sawada: \u201cRecognition of gaze and facial expressions for the realization of intuitive interface\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.355, 2003<br \/>\n[15] Yuki Kawai and Hideyuki Sawada: \u201cIntuitive communication system using a haptic monitor with a Pan-Tilt-Zoom camera\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.356, 2003<br \/>\n[16]\u00a0<span class=\"style8\">Best Presentation Prize \u53d7\u8cde<\/span><br \/>\nTakumi Ukegawa and Hideyuki Sawada: \u201cFuzzy glove for the Speed and Size-independent Gesture Recognition\u201d,<br \/>\n\u96fb\u6c17\u95a2\u4fc2\u5b66\u4f1a\u3000\u56db\u56fd\u652f\u90e8\u5927\u4f1a \u8b1b\u6f14\u8ad6\u6587\u96c6\u3001p.357, 2003<br \/>\n[17] TUTORIAL \u201cTutorial of Advanced Topics in Computational Intelligence and Related Researches\u201d,<br \/>\nin Chiang Mai University, Thailand, April 29 \u2013 May 2, 2003<br \/>\n<\/span><\/p>\n<p 
class=\"style3\"><strong><span class=\"style4\">2002\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a<\/span><\/strong><br \/>\n<span class=\"style6\">\u3000 [1] Toshio Higashimoto and Hideyuki Sawada: \u201cVocalization Control of a Mechanical Vocal System under the Auditory Feedback\u201d,<br \/>\nJournal of Robotics and Mechatronics, Vol.14, No.5, pp.453-461, 2002<br \/>\n[2] Toshio Higashimoto and Hideyuki Sawada: \u201cSpeech Production by a Mechanical Model: Construction of a Vocal Tract and Its Control by Neural Network\u201d,<br \/>\nIEEE International Conference on Robotics and Automation, pp.3858-3863, 2002<br \/>\n[3] Yuki Kawai and Hideyuki Sawada: \u201cIntuition-driven monitor: a monitor device driven by user\u2019s haptic manipulation\u201d,<br \/>\nIEEE International Conference on Systems, Man and Cybernetics, 2002<br \/>\n[4] Akihiro Hisada and Hideyuki Sawada: \u201cRealtime Clarification of Esophageal Speech Using a Comb Filter\u201d,<br \/>\nInternational Conference on Disability, Virtual Reality and Associated Technologies (ICDVRAT2002), pp.39-46, 2002<br \/>\n[5] Hideyuki Sawada, Yuki Kawai and Hidefumi Moritani: \u201cGestural Manipulation of Intuition-driven Monitor with the Detection of Human Sensory Factors\u201d,<br \/>\nInternational Conference of IEEE Industrial Electronics Society (IECON\u201902), pp.3019-3024, 2002<br \/>\n[6] \u6cb3\u5408\u6709\u8a18\u3001\u68ee\u8c37\u79c0\u6587\u3001\u6fa4\u7530\u79c0\u4e4b: \u300c\u529b\u899a\u304a\u3088\u3073\u8996\u7dda\u306b\u3088\u308a\u64cd\u4f5c\u53ef\u80fd\u306a\u30e2\u30cb\u30bf\u30c7\u30d0\u30a4\u30b9\u306e\u958b\u767a\u300d,<br \/>\n\u60c5\u5831\u51e6\u7406\u5b66\u4f1a \u30d2\u30e5\u30fc\u30de\u30f3\u30a4\u30f3\u30bf\u30d5\u30a7\u30fc\u30b9\u7814\u7a76\u4f1a \u7814\u7a76\u5831\u544a 2002-HI-99, pp.1-8, 2002<br \/>\n[7] \u6fa4\u7530\u79c0\u4e4b\uff1a\u300c\u9ad8\u9f62\u8005\u30fb\u969c\u788d\u8005\u652f\u63f4\u306e\u305f\u3081\u306e\u753b\u50cf\u51e6\u7406\u6280\u8853\u300d,<br 
\/>\nIEEJ Joint Technical Meeting on Information Processing and Industrial System Informatization, pp.7-12, 2002<br \/>\n[8] Toshio Higashimoto and Hideyuki Sawada: \u201cConsonant Generation by a Mechanical Voice System\u201d,<br \/>\nProceedings of the Shikoku-Section Joint Convention of the Institutes of Electrical and Related Engineers, p.346, 2002<br \/>\n[9] Minoru Ohkado and Hideyuki Sawada: \u201cDetection and Recognition of plural speakers using a Microphone Array\u201d,<br \/>\nProceedings of the Shikoku-Section Joint Convention of the Institutes of Electrical and Related Engineers, p.347, 2002<br \/>\n[10] Hidefumi Moritani and Hideyuki Sawada: \u201cGaze Recognition for the Operation of the Intuition-driven Monitor\u201d,<br \/>\nProceedings of the Shikoku-Section Joint Convention of the Institutes of Electrical and Related Engineers, p.348, 2002<br \/>\n[11] Takumi Ukegawa and Hideyuki Sawada: \u201cSpeed and Size-independent Gesture Recognition using Fuzzy Theory\u201d,<br \/>\nProceedings of the Shikoku-Section Joint Convention of the Institutes of Electrical and Related Engineers, p.349, 2002<br \/>\n[12] Hideyuki Sawada, Yuki Kawai and Hidefumi Moritani: \u201cA Monitor Device Operable by Gaze and Haptics\u201d (in Japanese),<br \/>\nProceedings of the Workshop on Automated Visual Inspection (VIEW2002), pp.25-30, 2002<br \/>\n[13] Hidefumi Moritani, Yuki Kawai and Hideyuki Sawada: \u201cA Haptically Operable Monitor Device with Gaze and Facial Expression Recognition\u201d (in Japanese),<br \/>\nHuman Interface Society 
SIG Non-Verbal Interface Proceedings, pp.43-49, 2002<br \/>\n[14] Hideyuki Sawada, Yuki Kawai and Hidefumi Moritani: \u201cAn Interface Using Haptics and Gaze, and Design Support\u201d (in Japanese),<br \/>\nSICE Systems Engineering Division Technical Meeting, pp.57-62, 2002<br \/>\n[15] Minoru Ohkado and Hideyuki Sawada: \u201cSpeaker Detection and Speech Enhancement Using a Microphone Array\u201d (in Japanese),<br \/>\nSICE Sanuki Control Workshop Proceedings, 2002<br \/>\n[16] Takumi Ukegawa and Hideyuki Sawada: \u201cRobust Gesture Recognition by Resampling of Positions\u201d (in Japanese),<br \/>\nSICE Sanuki Control Workshop Proceedings, 2002<br \/>\n[17] Hideyuki Sawada: \u201cAn Attempt at Clarification Filtering of Unclear Speech Caused by Vocal Disorders and the Potential for Developing Assistive Technologies\u201d (in Japanese),<br \/>\nWorkshop on the Clarification of Unclear Speech Caused by Vocal Disorders and Communication Support, 2003<br \/>\n[18] 
Yuki Kawai and Hideyuki Sawada: \u201cGesture Recognition with a Haptically Operable Monitor\u201d (in Japanese),<br \/>\nJSME Robotics and Mechatronics Conference ROBOMEC\u201902, 2002<br \/>\n<\/span><\/p>\n<p class=\"style3\"><strong><span class=\"style4\">FY2001 Papers:<\/span><\/strong><br \/>\n<span class=\"style6\">\u3000 [1] Takuto Notsu, Shuji Hashimoto and Hideyuki Sawada: \u201cA Multi-modal Sign Language Database and Data Retrieval with Gesture Input\u201d (in Japanese),<br \/>\nJournal of the Human Interface Society, Vol.4, No.1, pp.51-58, 2002<br \/>\n[2] Toshio Higashimoto and Hideyuki Sawada: \u201cSound Generation by a Physical Vocal Tract Model\u201d (in Japanese),<br \/>\nIPSJ Interaction Proceedings, pp.125-132, 2002<br \/>\n[3] Akihiro Hisada and Hideyuki Sawada: \u201cReal-time Clarification Filtering of Esophageal Speech Using a Comb Filter\u201d (in Japanese),<br \/>\nIEICE Technical Report on Well-being Information Technology, WIT2001-49, pp.37-42, 2002<br \/>\n[4] Hideyuki Sawada and Shuji Hashimoto: \u201cMulti-modal Gesture Database and Its Retrieval by Gesture Inputs\u201d,<br \/>\nProc. Int\u2019l Conf. 
Quality Control by Artificial Vision, Vol.2, pp.584-589, 2001<br \/>\n[5] Yuki Kawai and Hideyuki Sawada: \u201cDevelopment of a Monitor Enabling Intuitive Haptic Operation\u201d (in Japanese),<br \/>\nJSPE Intelligent Mechatronics Workshop Report, Vol.6, No.4, pp.27-32, 2002<br \/>\n[6] Yuki Kawai and Hideyuki Sawada: \u201cDevelopment of Intuition-driven Monitor for Haptic Communication\u201d,<br \/>\nProceedings of the Shikoku-Section Joint Convention of the Institutes of Electrical and Related Engineers, p.302, 2001<br \/>\n[7] Toshio Higashimoto and Hideyuki Sawada: \u201cSound Generation by a Physical Vocal Tract Model with Auditory Feedback\u201d (in Japanese),<br \/>\nJSME Robotics and Mechatronics Conference ROBOMEC\u201901, 2001<br \/>\n[8] Akihiro Hisada and Hideyuki Sawada: \u201cA Clarification Filter for Esophageal Speech\u201d (in Japanese),<br \/>\nSICE Sanuki Control Workshop Proceedings, pp.9-14, 2002<br \/>\n[9] Toshio Higashimoto and Hideyuki Sawada: \u201cControl of a Physical Vocal Tract Model by a Neural Network\u201d (in Japanese),<br \/>\nSICE Sanuki Control Workshop Proceedings, pp.14-21, 2002<br \/>\n<\/span><\/p>\n<p class=\"style3\"><strong><span class=\"style4\">FY2000 Papers:<\/span><\/strong><br \/>\n<span class=\"style6\">\u3000 [1] Hideyuki Sawada and Shuji 
Hashimoto: \u201cMechanical Model of Human Vocal System and Its Control with Auditory Feedback\u201d,<br \/>\nJSME International Journal, Series C, Vol.43, No.3, pp.645-652, 2000<br \/>\n[2] Hideyuki Sawada and Shuji Hashimoto: \u201cMechanical Construction of a Human Vocal System for Singing Voice Production\u201d,<br \/>\nAdvanced Robotics, International Journal of the Robotics Society of Japan, Vol.13, No.7, pp.647-661, 2000<br \/>\n[3] Hideyuki Sawada and Shuji Hashimoto: \u201cGesture Recognition toward Human-Machine Interaction\u201d (in Japanese),<br \/>\nIEEJ Technical Meeting on Industrial System Informatization, pp.43-48, 2000<br \/>\n[4] \u201cConstruction of a Purchased-Product Evaluation System Using an Indoor Model of a Personal Environment\u201d (in Japanese),<br \/>\nIEEJ Technical Meeting on Industrial System Informatization, pp.1-4, 2000<br \/>\n[5] Hideyuki Sawada and Shuji Hashimoto: \u201cA Multi-modal Gesture Database and Its Data Retrieval\u201d (in Japanese),<br \/>\nProceedings of the 5th Intelligent Mechatronics Workshop, pp.181-186, 2000<br \/>\n[6] \u201cSound Generation by a Physical Vocal Tract Model\u201d (in Japanese),<br \/>\nIPSJ SIG Technical Report on Music and Computer, pp.13-18, 2000<br \/>\n<\/span><\/p>\n<p class=\"style3\"><strong><span class=\"style4\">FY1999 Papers:<\/span><\/strong><br \/>\n<span class=\"style6\">\u3000 [1] Hideyuki Sawada and Shuji Hashimoto: 
\u201cMechanical Construction of Human Vocal System as a Musical Instrument\u201d,<br \/>\nProceedings of Pioneering International Symposium on Motion and Vibration Control in Mechatronics \u201999 (JSME), pp. 86-91<br \/>\n[2] Seiji Hata and Hideyuki Sawada: \u201cPersonal Room Modeling System and Gesture Recognition System for Personal Design Evaluation\u201d,<br \/>\nProceedings of International Conference on Quality Control by Artificial Vision QCAV\u201999, 1999<br \/>\n[3] Hideyuki Sawada, Takuto Notsu and Shuji Hashimoto: \u201cGesture-Sensitive Interface for Human-Machine Interaction\u201d,<br \/>\nProceedings of International Conference on Quality Control by Artificial Vision QCAV\u201999, 1999<br \/>\n[4] Hideyuki Sawada, Seiji Hata and Shuji Hashimoto: \u201cGesture Recognition for Human-Friendly Interface in Designer \u2013 Consumer Cooperate Design System\u201d,<br \/>\nProceedings of IEEE International Workshop on Robot and Human Communication (RO-MAN) \u201999, pp.400-405, 1999<br \/>\n[5] Hideyuki Sawada: \u201cGesture Recognition for Human-Friendly Interface\u201d,<br \/>\nShikoku-Section Joint Convention of the Institutes of Electrical and Related Engineers, p.297, 1999<br \/>\n[6] Takuto Notsu, Shuji Hashimoto and Hideyuki Sawada: \u201cA Sign Language Word Database and Word Retrieval from Sign Language\u201d (in Japanese),<br \/>\nIEICE 1st Technical Meeting on Well-being Information Technology, WIT 99-4, pp.23-28, 1999<br \/>\n[7] Hideyuki Sawada and Shuji Hashimoto: \u201cA Haptic Device Driven by Grasping Force For Hand Gesture Tele-Communication\u201d,<br \/>\nProceedings of the ASME Dynamic Systems and Control Division \u2013 1999, pp.437-444, 1999 ASME International Mechanical Engineering Congress and Exposition<br \/>\n[8] Takuto Notsu, Pitoyo Hartono, Shuji Hashimoto and Hideyuki Sawada: \u201cMulti-modal 
Gesture Database and Gesture Recognition Using Wearable Devices\u201d,<br \/>\nProceedings of 6th Korea-Japan Joint Workshop on Computer Vision (FCV 2000), pp.163-168, 2000<br \/>\n[9] Takuto Notsu, Shuji Hashimoto and Hideyuki Sawada: \u201cA Sign Language Word Database and Optimization of Retrieval with Sign Language Input\u201d (in Japanese),<br \/>\nProceedings of the 60th IPSJ National Convention, 2000<br \/>\n<\/span><\/p>\n<p class=\"style3\"><strong><span class=\"style4\">FY1998 Papers:<\/span><\/strong><br \/>\n<span class=\"style6\">\u3000 [1] Hideyuki Sawada, Naoyuki Onoe and Shuji Hashimoto: \u201cSound Generation Using a Hand-Gesture Input Device\u201d (in Japanese),<br \/>\nIEICE Transactions D-II, Vol.J81-D-II, No.5, pp.795-803<br \/>\n[2] Hideyuki Sawada, Shuji Hashimoto and Toshiaki Matsushima: \u201cGesture Recognition Based on Motion and Shape Features and Its Application to Sign Language Recognition\u201d (in Japanese),<br \/>\nIPSJ Journal, Vol.39, No.5, pp.1325-1333<br \/>\n[3] \u201cHMM Based Gesture Recognition Using an Acceleration Sensor\u201d,<br \/>\nProceedings of IEEE International Workshop on Robot and Human Communication RO-MAN\u201998, pp. 500-506, 1998<br \/>\n[4] \u201cHaptic and Gesture Human-Machine Interface with Sensibility\u201d,<br \/>\nProceedings of International Conference on Quality Control by Artificial Vision QCAV\u201998, pp. 
463-468, 1998<br \/>\n[5] \u201cJapanese Sign-Language Recognition Based on Gesture Primitives Using Acceleration Sensors and Datagloves\u201d,<br \/>\nProceedings of The 2nd European Conference on Disability, Virtual Reality and Associated Technologies (ECDVRAT) \u201998, pp. 149-157, 1998<br \/>\n[6] \u201cGraspCom: A Prototype Bidirectional Input\/Output Device Using the Sense of Force\u201d (in Japanese),<br \/>\nIPSJ Interaction \u201999 Proceedings, pp.201-208, 1999<br \/>\n[7] \u201cA Multi-modal Sign Language Database and Its Application to Sign Language Recognition\u201d (in Japanese),<br \/>\nIEICE Technical Report HIP98-56, pp. 33-38, 1999<br \/>\n<\/span><\/p>\n<p class=\"style3\"><strong><span class=\"style4\">FY1997 Papers:<\/span><\/strong><br \/>\n<span class=\"style6\">\u3000 [1] \u201cSound Generation Using Twiddle Interface\u201d<br \/>\nProc. of IEEE\/ASME International Conference on Advanced Intelligent Mechatronics \u201997<br \/>\n[2] \u201cSounds in Hands \u2013 A Sound Modifier Using Datagloves and Twiddle Interface \u2013\u201d<br \/>\nProc. 
of International Computer Music Conference \u201997, pp.309-312<br \/>\n[3] \u201cGesture Recognition Using an Acceleration Sensor and Its Application to Musical Performance Control\u201d<br \/>\nTransactions of Electronics and Communications in Japan, Vol.80, No.5, pp.9-17<br \/>\n[4] \u201cA Prototype Virtual Instrument Using a Grasping-Force Interface: GraspMIDI\u201d (in Japanese),<br \/>\nIPSJ Interaction \u201998 Proceedings, pp.27-28<br \/>\n[5] \u201cTimbre Conversion of Sounds Using a Virtual Resonance Tube\u201d (in Japanese),<br \/>\n56th IPSJ National Convention, 4M-5, pp.2-48 \u2013 2-49<br \/>\n[6] \u201cGesture Recognition by HMM Using an Acceleration Sensor\u201d (in Japanese),<br \/>\n56th IPSJ National Convention<br \/>\n<\/span><\/p>\n<p class=\"style3\"><strong><span class=\"style4\">FY1996 Papers:<\/span><\/strong><br \/>\n<span class=\"style6\">\u3000 [1] \u201cMechanical Realization of a Vocal Tract Model and Its Computer Control\u201d (in Japanese),<br \/>\nIPSJ SIG Technical Report on Music and Computer, 16-2, pp.7-12<br \/>\n[2]\u00a0<span class=\"style8\">Awarded the Distinguished Paper Award<\/span><br \/>\nHideyuki Sawada and Shuji Hashimoto, \u201cAdaptive Control of a Vocal Chord and Vocal Tract for Computerized Mechanical Singing Instruments\u201d<br \/>\nProc. of International Computer Music Conference \u201996 pp.444-447<br \/>\n[3] \u201cAcceleration Sensor as an Input Device for Musical Environment\u201d<br \/>\nProc. 
of International Computer Music Conference \u201996 pp.421-424<br \/>\n[4] \u201cGesture Recognition as a Human Interface\u201d (in Japanese),<br \/>\nIPSJ Interaction \u201997 Proceedings, pp.25-32<br \/>\n<\/span><\/p>\n<p class=\"style3\"><strong><span class=\"style4\">FY1995 Papers:<\/span><\/strong><br \/>\n<span class=\"style6\">\u3000 [1] \u201cGesture Recognition Using an Acceleration Sensor and Its Application to Music Control\u201d (in Japanese),<br \/>\nIEICE Transactions, Vol.J79-A, No.2, pp.452-459<br \/>\n[2] \u201cGesture Analysis Using 3D Acceleration Sensor for Music Control\u201d<br \/>\nProc. of International Computer Music Conference \u201995 pp.257-260<br \/>\n[3] \u201cMusical Performance System Using 3D Acceleration Sensor\u201d<br \/>\nProc. 
of International Conference on Multi-Media Modeling,<br \/>\n\u201cMULTIMEDIA MODELING Towards Information Superhighway\u201d, WORLD SCIENTIFIC, pp.293-306<br \/>\n[4] \u201cA Basic Study of Singing Voice Generation by a Computer-Controlled Mechanical System\u201d (in Japanese),<br \/>\n52nd IPSJ National Convention (first half of 1996), pp.1-455 \u2013 1-456<br \/>\n<\/span><\/p>\n<p class=\"style3\"><strong><span class=\"style4\">FY1994 and Earlier:<\/span><\/strong><br \/>\n<span class=\"style6\">\u3000 [1] \u201cAn Automatic Music-Score Braille Translation System Using DMP\u201d (in Japanese),<br \/>\nProceedings of the 15th Sensory Substitution Symposium, pp.96-102<br \/>\n[2] \u201cA Report on Practical Trials of the Automatic Music-Score Braille Translation System\u201d (in Japanese),<br \/>\nProceedings of the 15th Sensory Substitution Symposium, pp.85-91<br \/>\n[3] \u201cA Display for the Blind Based on Automatic Coordinate Detection: Application to Braille Music Score Reading Control\u201d (in Japanese),<br \/>\nProceedings of the 15th Sensory Substitution Symposium, pp.103-105<br \/>\n[4] \u201cA Practical Automated Bilateral Translation System between Printed Music and Braille\u201d<br \/>\nProc. 
6th Int\u2019l Workshop on Computer Applications for the Visually Handicapped, pp.1-9<br \/>\n[5] \u201cAutomatic Braille Translation of Music Scores\u201d (in Japanese),<br \/>\nRehabilitation Engineering, Vol.5, No.2, pp.22-30, 1991<br \/>\n[6] \u201cAn Automatic Music-Score Generation System Considering the Potentials of Notes\u201d (in Japanese),<br \/>\nProceedings of the 42nd IPSJ National Convention, pp.1-319 \u2013 1-320<br \/>\n[7] \u201cA Braille Music Score Learning Support System\u201d (in Japanese),<br \/>\nProceedings of the 17th Sensory Substitution Symposium, pp.11-15<br \/>\n[8] \u201cA Braille Music Score Learning Support System\u201d (in Japanese),<br \/>\nProceedings of the 44th IPSJ National Convention (1), pp.1-383 \u2013 1-384<br \/>\n[9] \u201cOn Cooperative Operation between Humans and Machines in Musical Performance\u201d (in Japanese),<br \/>\nProceedings of the 44th IPSJ National Convention (1), pp.1-389 \u2013 1-390<br \/>\n[10] \u201cHand Gesture Analysis Using an Acceleration Sensor and Real-time Sound Control\u201d (in Japanese),<br \/>\nProceedings of the 50th IPSJ National Convention (1), pp.1-359 \u2013 1-360<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>2016\u5e74\u5ea6\u3000\u8ad6\u6587\uff1a \u3000 [1] Junichi Danjo, Sonoko D&#8230; <a class=\"more-link\" href=\"https:\/\/www.sawada.phys.waseda.ac.jp\/?page_id=499\">Continue Reading 
&rarr;<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"_links":{"self":[{"href":"https:\/\/www.sawada.phys.waseda.ac.jp\/index.php?rest_route=\/wp\/v2\/pages\/499"}],"collection":[{"href":"https:\/\/www.sawada.phys.waseda.ac.jp\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/www.sawada.phys.waseda.ac.jp\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/www.sawada.phys.waseda.ac.jp\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.sawada.phys.waseda.ac.jp\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=499"}],"version-history":[{"count":4,"href":"https:\/\/www.sawada.phys.waseda.ac.jp\/index.php?rest_route=\/wp\/v2\/pages\/499\/revisions"}],"predecessor-version":[{"id":504,"href":"https:\/\/www.sawada.phys.waseda.ac.jp\/index.php?rest_route=\/wp\/v2\/pages\/499\/revisions\/504"}],"wp:attachment":[{"href":"https:\/\/www.sawada.phys.waseda.ac.jp\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=499"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}