<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Embedded Learning Library &#8211; 编码无悔 /  Intent &amp; Focused</title>
	<atom:link href="https://www.codelast.com/tag/embedded-learning-library/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.codelast.com</link>
	<description>The Road to Optimization</description>
	<lastBuildDate>Mon, 27 Apr 2020 17:55:57 +0000</lastBuildDate>
	<language>zh-Hans</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>[Original] Using Microsoft's ELL Embedded Learning Library on the Raspberry Pi 3 (5)</title>
		<link>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%935/</link>
					<comments>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%935/#respond</comments>
		
		<dc:creator><![CDATA[learnhard]]></dc:creator>
		<pubDate>Sun, 20 Aug 2017 05:12:47 +0000</pubDate>
				<category><![CDATA[Linux]]></category>
		<category><![CDATA[Raspberry Pi/树莓派]]></category>
		<category><![CDATA[原创]]></category>
		<category><![CDATA[Embedded Learning Library]]></category>
		<category><![CDATA[Microsoft ELL]]></category>
		<category><![CDATA[Raspberry Pi]]></category>
		<category><![CDATA[机器视觉]]></category>
		<category><![CDATA[树莓派]]></category>
		<guid isPermaLink="false">https://www.codelast.com/?p=9778</guid>

					<description><![CDATA[<p>
OS: Arch Linux ARM<br />
ELL: the latest version on GitHub at the time of writing<br />
TensorFlow: v1.1.0</p>
<p>This article is a continuation of <a href="https://www.codelast.com/?p=9609" target="_blank" rel="noopener noreferrer"><span style="background-color: rgb(255, 160, 122);">the previous article</span></a>.<br />
At the end of June 2017, Microsoft released <a href="https://github.com/Microsoft/ELL" target="_blank" rel="noopener noreferrer"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a> (<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>), a machine learning library aimed mainly at embedded systems (e.g. the Raspberry Pi, ARM Cortex-M0, etc.).<br />
In the previous articles I went to great lengths to finally get the ELL demo running on the Raspberry Pi, but is it actually practical? In this article I run a quick test of inference speed and accuracy using the Darknet model.<br />
<span id="more-9778"></span><br />
<span style="background-color:#00ff00;">『1』</span>The comparison baseline<br />
I had previously installed <a href="https://github.com/samjabrahams/tensorflow-on-raspberry-pi" target="_blank" rel="noopener noreferrer"><span style="background-color: rgb(255, 160, 122);">TensorFlow on Raspberry Pi</span></a> (with the Inception v3 model), and I doubt any framework is easier to deploy on the Raspberry Pi: it is practically zero-cost work, done in the blink of an eye. So I used it for a simple comparison against ELL.<br />
<span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
<span style="background-color:#00ff00;">『2』</span>Test data<br />
I picked 10 pictures of common objects at random for the test. Some will ask what such a small sample can prove: first, I only wanted a quick, rough check of how ELL performs; second, I did not have time for a rigorous test.<br />
Here are the 10 pictures, numbered 1 through 10:<br />
<a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_1.jpg" style="text-align: center; width: 400px; height: 267px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_2.jpg" style="text-align: center; width: 400px; height: 533px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_3.jpg" style="text-align: center; width: 400px; height: 249px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_4.jpg" style="text-align: center; width: 400px; height: 267px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_5.jpg" style="text-align: center; width: 400px; height: 267px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_6.jpg" style="text-align: center; width: 400px; height: 275px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_7.jpg" style="text-align: center; width: 400px; height: 250px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_8.jpg" style="text-align: center; width: 400px; height: 225px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_9.jpg" style="text-align: center; width: 400px; height: 457px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_10.jpg" style="text-align: center; width: 400px; height: 225px;" /></a></p>
<p><span style="background-color:#00ff00;">『3』</span>Test results</p>
<ul>
<li>
		Speed</li>
</ul>
<p>Speed is measured as the average inference time per image. The TensorFlow model needs a <a href="https://www.codelast.com/?p=8984" target="_blank" rel="noopener noreferrer"><span style="background-color:#ffa07a;">warm-up</span></a>, so timing starts only after the warm-up; likewise, ELL's model-loading time is excluded.</p>
<table style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 0px; table-layout: fixed; border-collapse: collapse; width: 727px; font-family: sans-serif; font-size: 14px;">
<thead style="margin: 0px; border-width: 0px 0px 2px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px; background-color: rgb(247, 247, 247);">
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<th style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				Description</th>
<th style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				TensorFlow</th>
<th style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				ELL</th>
</tr>
</thead>
<tbody style="margin: 0px; border: 0px; padding: 0px;">
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				Average inference time per image (seconds)</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				5.477</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				1.324</td>
</tr>
</tbody>
</table>
<p>&#8230; <a href="https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%935/" class="read-more">Read More </a></p>]]></description>
										<content:encoded><![CDATA[<p>
OS: Arch Linux ARM<br />
ELL: the latest version on GitHub at the time of writing<br />
TensorFlow: v1.1.0</p>
<p>This article is a continuation of <a href="https://www.codelast.com/?p=9609" target="_blank" rel="noopener noreferrer"><span style="background-color: rgb(255, 160, 122);">the previous article</span></a>.<br />
At the end of June 2017, Microsoft released <a href="https://github.com/Microsoft/ELL" target="_blank" rel="noopener noreferrer"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a> (<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>), a machine learning library aimed mainly at embedded systems (e.g. the Raspberry Pi, ARM Cortex-M0, etc.).<br />
In the previous articles I went to great lengths to finally get the ELL demo running on the Raspberry Pi, but is it actually practical? In this article I run a quick test of inference speed and accuracy using the Darknet model.<br />
<span id="more-9778"></span><br />
<span style="background-color:#00ff00;">『1』</span>The comparison baseline<br />
I had previously installed <a href="https://github.com/samjabrahams/tensorflow-on-raspberry-pi" target="_blank" rel="noopener noreferrer"><span style="background-color: rgb(255, 160, 122);">TensorFlow on Raspberry Pi</span></a> (with the Inception v3 model), and I doubt any framework is easier to deploy on the Raspberry Pi: it is practically zero-cost work, done in the blink of an eye. So I used it for a simple comparison against ELL.<br />
<span style="background-color:#00ff00;">『2』</span>Test data<br />
I picked 10 pictures of common objects at random for the test. Some will ask what such a small sample can prove: first, I only wanted a quick, rough check of how ELL performs; second, I did not have time for a rigorous test.<br />
Here are the 10 pictures, numbered 1 through 10:<br />
<a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_1.jpg" style="text-align: center; width: 400px; height: 267px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_2.jpg" style="text-align: center; width: 400px; height: 533px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_3.jpg" style="text-align: center; width: 400px; height: 249px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_4.jpg" style="text-align: center; width: 400px; height: 267px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_5.jpg" style="text-align: center; width: 400px; height: 267px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_6.jpg" style="text-align: center; width: 400px; height: 275px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_7.jpg" style="text-align: center; width: 400px; height: 250px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_8.jpg" style="text-align: center; width: 400px; height: 225px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_9.jpg" style="text-align: center; width: 400px; height: 457px;" /></a></p>
<p><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><img decoding="async" alt="ELL test image" src="https://www.codelast.com/wp-content/uploads/2017/08/ell_test_image_10.jpg" style="text-align: center; width: 400px; height: 225px;" /></a></p>
<p><span style="background-color:#00ff00;">『3』</span>Test results</p>
<ul>
<li>
		Speed</li>
</ul>
<p>Speed is measured as the average inference time per image. The TensorFlow model needs a <a href="https://www.codelast.com/?p=8984" target="_blank" rel="noopener noreferrer"><span style="background-color:#ffa07a;">warm-up</span></a>, so timing starts only after the warm-up; likewise, ELL's model-loading time is excluded.</p>
<table style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 0px; table-layout: fixed; border-collapse: collapse; width: 727px; font-family: sans-serif; font-size: 14px;">
<thead style="margin: 0px; border-width: 0px 0px 2px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px; background-color: rgb(247, 247, 247);">
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<th style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				Description</th>
<th style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				TensorFlow</th>
<th style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				ELL</th>
</tr>
</thead>
<tbody style="margin: 0px; border: 0px; padding: 0px;">
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				Average inference time per image (seconds)</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				5.477</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				1.324</td>
</tr>
</tbody>
</table>
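<p>The timing method described above (warm up first, then time only the real inference runs, excluding model loading) can be sketched as follows; <code>predict</code>, <code>images</code>, and <code>warmup_runs</code> are illustrative placeholders, not the actual ELL or TensorFlow API:</p>

```python
import time

def average_inference_time(predict, images, warmup_runs=2):
    """Average per-image inference time, excluding warm-up runs.

    `predict` is any model's inference function and `images` a list of
    preprocessed inputs; both are placeholders for illustration.
    """
    # Warm-up: the first run(s) trigger lazy initialization (graph
    # setup, caches) and would skew the average if included.
    for _ in range(warmup_runs):
        predict(images[0])

    # Time only the real inference passes.
    start = time.perf_counter()
    for img in images:
        predict(img)
    elapsed = time.perf_counter() - start
    return elapsed / len(images)
```

<p>With 10 test images this returns a single mean figure per framework, which is how the 5.477 s vs. 1.324 s numbers in the table above are compared.</p>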
<p>As you can see, ELL with the Darknet model has a clear speed advantage.</p>
<ul>
<li>
		Accuracy</li>
</ul>
<p>The results were somewhat disappointing. Judging by the class with the highest output probability, ELL's inference with the Darknet model on the 10 images was poor: by my own judgment, 5 images were clearly misclassified and the other 5 were correct, which is rather awkward. By contrast, TensorFlow with the Inception v3 model did much better: apart from one misclassification, the other 9 gave &ldquo;reasonably correct&rdquo; answers.<br />
Here are the detailed results:</p>
<table style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 0px; table-layout: fixed; border-collapse: collapse; width: 727px; font-family: sans-serif; font-size: 14px;">
<thead style="margin: 0px; border-width: 0px 0px 2px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px; background-color: rgb(247, 247, 247);">
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<th style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				Image #</th>
<th style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				TensorFlow result</th>
<th style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				ELL result</th>
</tr>
</thead>
<tbody style="margin: 0px; border: 0px; padding: 0px;">
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				1</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				Granny Smith (a green apple variety from Australia)</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				banana</td>
</tr>
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				2</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				mountain bike, all-terrain bike, off-roader</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				chainlink fence</td>
</tr>
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				3</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				cellular telephone, cellular phone, cellphone, cell, mobile phone</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				iPod</td>
</tr>
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				4</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				tricycle, trike, velocipede</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				ice lolly</td>
</tr>
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				5</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				warplane, military plane</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				airliner</td>
</tr>
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				6</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				African elephant, Loxodonta africana</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				African elephant</td>
</tr>
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				7</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				hair slide</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				piggy bank</td>
</tr>
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				8</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				notebook, notebook computer</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				notebook</td>
</tr>
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				9</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				trolleybus, trolley coach, trackless trolley</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				trolleybus</td>
</tr>
<tr style="margin: 0px; border-width: 0px 0px 1px; border-top-style: initial; border-right-style: initial; border-bottom-style: solid; border-left-style: initial; border-top-color: initial; border-right-color: initial; border-bottom-color: rgb(20, 145, 232); border-left-color: initial; border-image: initial; padding: 0px;">
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				10</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				tabby, tabby cat</td>
<td style="margin: 0px; border-width: 0px; border-style: initial; border-color: initial; padding: 5px 5px 5px 15px; line-height: 2; font-size: 12px; overflow: hidden; min-width: 50px;">
				tabby</td>
</tr>
</tbody>
</table>
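<p>The comparison above judges each model only by its top-1 prediction, i.e. the class with the highest output probability. A minimal sketch of that selection step (function and argument names are illustrative, not the actual ELL or TensorFlow API):</p>

```python
def top1_label(scores, labels):
    """Return the label of the class with the highest score.

    `scores` is a flat list of per-class probabilities (or logits) as a
    classifier would emit, and `labels` the matching class names loaded
    from a category file; both are placeholders for illustration.
    """
    # argmax over the score list, then map the index to its name.
    best = max(range(len(scores)), key=lambda i: scores[i])
    return labels[best]
```

<p>Each row of the table is the result of applying this argmax to one image's output vector for each framework.</p>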
<p>However fast the inference is, such poor accuracy is completely unacceptable. To change this result, I am afraid I would have to run the test again with the ELL CNTK VGG model. As I mentioned in an earlier article, my Ubuntu desktop is underpowered: when I compiled the ELL CNTK VGG model code on it, the build consumed so many OS resources that the kernel automatically killed it. So I had no choice but to use the lightweight Darknet model, and under my current conditions this is as far as I can go.</p>
<p><span style="background-color: rgb(0, 255, 0);">『4』</span>Conclusion<br />
ELL with the Darknet model is indeed fast, but its low accuracy means the combination still needs improvement.<br />
<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;Copyright notice&nbsp;<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;<br />
Reproduction requires attribution: <u><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><em><span style="color: rgb(0, 0, 255);"><strong style="font-size: 16px;"><span style="font-family: arial, helvetica, sans-serif;">codelast.com</span></strong></span></em></a></u>&nbsp;<br />
Thanks for following my WeChat official account (scan the QR code with WeChat):</p>
<p style="border: 0px; font-size: 13px; margin: 0px 0px 9px; outline: 0px; padding: 0px; color: rgb(77, 77, 77);">
	<img decoding="async" alt="wechat qrcode of codelast" src="https://www.codelast.com/codelast_wechat_qr_code.jpg" style="width: 200px; height: 200px;" /></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%935/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>[Original] A Collection of Articles on Microsoft's ELL (Embedded Learning Library)</title>
		<link>https://www.codelast.com/%e5%8e%9f%e5%88%9b-ell%ef%bc%88embedded-learning-library%ef%bc%8c%e5%be%ae%e8%bd%af%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%93%ef%bc%89%e6%96%87%e7%ab%a0%e5%90%88%e9%9b%86/</link>
					<comments>https://www.codelast.com/%e5%8e%9f%e5%88%9b-ell%ef%bc%88embedded-learning-library%ef%bc%8c%e5%be%ae%e8%bd%af%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%93%ef%bc%89%e6%96%87%e7%ab%a0%e5%90%88%e9%9b%86/#respond</comments>
		
		<dc:creator><![CDATA[learnhard]]></dc:creator>
		<pubDate>Sat, 12 Aug 2017 15:06:47 +0000</pubDate>
				<category><![CDATA[Linux]]></category>
		<category><![CDATA[Raspberry Pi/树莓派]]></category>
		<category><![CDATA[原创]]></category>
		<category><![CDATA[Embedded Learning Library]]></category>
		<category><![CDATA[Microsoft ELL]]></category>
		<category><![CDATA[Raspberry Pi]]></category>
		<category><![CDATA[机器视觉]]></category>
		<category><![CDATA[树莓派]]></category>
		<guid isPermaLink="false">https://www.codelast.com/?p=9754</guid>

					<description><![CDATA[<p>In late June 2017, Microsoft released <a href="https://github.com/Microsoft/ELL" rel="noopener noreferrer" target="_blank"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a> (<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>), a machine learning library aimed mainly at embedded systems (e.g., the Raspberry Pi and ARM Cortex-M0), intended to shift part of the cloud's machine learning computation onto embedded devices.<br />
This series records the problems I ran into, and how I solved them, while getting the ELL demo running on a Raspberry Pi 3.</p>
<p><span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9401" rel="noopener noreferrer" target="_blank">Using Microsoft's ELL Embedded Learning Library on Raspberry Pi 3 (1)</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9635" rel="noopener noreferrer" target="_blank">Using Microsoft's ELL Embedded Learning Library on Raspberry Pi 3 (2)</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9673" rel="noopener noreferrer" target="_blank">Using Microsoft's ELL Embedded Learning Library on Raspberry Pi 3 (3)</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9609" rel="noopener noreferrer" target="_blank">Using Microsoft's ELL Embedded Learning Library on Raspberry Pi 3 (4)</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9778" rel="noopener noreferrer" target="_blank">Using Microsoft's ELL Embedded Learning Library on Raspberry Pi 3 (5)</a><br />
<span id="more-9754"></span><br />
<span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9710" rel="noopener noreferrer" target="_blank">Running the ELL demo on Raspberry Pi 3 fails with: ImportError: build/_darknetReference.so: undefined symbol: cblas_sgemm</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9594" rel="noopener noreferrer" target="_blank">The ELL demo program cntkDemo.py hangs on execution</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9473" rel="noopener noreferrer" target="_blank">Fixing the ELL demo error: OpenCV Error: Unspecified error in cvShowImage</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9505" rel="noopener noreferrer" target="_blank">Fixing the ELL build error: undefined reference to `cblas_xxx&#39;</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9493" rel="noopener noreferrer" target="_blank">Installing Python 3.6 for ELL on Ubuntu 14.04 via miniconda</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9496" rel="noopener noreferrer" target="_blank">Installing gcc 6 on Ubuntu 14.04</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&#160;</span><a href="https://www.codelast.com/?p=9464" rel="noopener noreferrer" target="_blank">Upgrading Open MPI on Ubuntu 14.04 to libmpi.so.12</a>&#8230; <a href="https://www.codelast.com/%e5%8e%9f%e5%88%9b-ell%ef%bc%88embedded-learning-library%ef%bc%8c%e5%be%ae%e8%bd%af%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%93%ef%bc%89%e6%96%87%e7%ab%a0%e5%90%88%e9%9b%86/" class="read-more">Read More </a></p>]]></description>
										<content:encoded><![CDATA[<p>In late June 2017, Microsoft released <a href="https://github.com/Microsoft/ELL" rel="noopener noreferrer" target="_blank"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a> (<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>), a machine learning library aimed mainly at embedded systems (e.g., the Raspberry Pi and ARM Cortex-M0), intended to shift part of the cloud's machine learning computation onto embedded devices.<br />
This series records the problems I ran into, and how I solved them, while getting the ELL demo running on a Raspberry Pi 3.</p>
<p><span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9401" rel="noopener noreferrer" target="_blank">Using Microsoft's ELL Embedded Learning Library on Raspberry Pi 3 (1)</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9635" rel="noopener noreferrer" target="_blank">Using Microsoft's ELL Embedded Learning Library on Raspberry Pi 3 (2)</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9673" rel="noopener noreferrer" target="_blank">Using Microsoft's ELL Embedded Learning Library on Raspberry Pi 3 (3)</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9609" rel="noopener noreferrer" target="_blank">Using Microsoft's ELL Embedded Learning Library on Raspberry Pi 3 (4)</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9778" rel="noopener noreferrer" target="_blank">Using Microsoft's ELL Embedded Learning Library on Raspberry Pi 3 (5)</a><br />
<span id="more-9754"></span><br />
<span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9710" rel="noopener noreferrer" target="_blank">Running the ELL demo on Raspberry Pi 3 fails with: ImportError: build/_darknetReference.so: undefined symbol: cblas_sgemm</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9594" rel="noopener noreferrer" target="_blank">The ELL demo program cntkDemo.py hangs on execution</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9473" rel="noopener noreferrer" target="_blank">Fixing the ELL demo error: OpenCV Error: Unspecified error in cvShowImage</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9505" rel="noopener noreferrer" target="_blank">Fixing the ELL build error: undefined reference to `cblas_xxx&#39;</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9493" rel="noopener noreferrer" target="_blank">Installing Python 3.6 for ELL on Ubuntu 14.04 via miniconda</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9496" rel="noopener noreferrer" target="_blank">Installing gcc 6 on Ubuntu 14.04</a></p>
<p><span style="background-color: rgb(0, 255, 0);">➤&nbsp;</span><a href="https://www.codelast.com/?p=9464" rel="noopener noreferrer" target="_blank">Upgrading Open MPI on Ubuntu 14.04 to libmpi.so.12</a></p>
<p><span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;Copyright notice&nbsp;<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;<br />
Please credit the source when reposting: <u><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><em><span style="color: rgb(0, 0, 255);"><strong style="font-size: 16px;"><span style="font-family: arial, helvetica, sans-serif;">codelast.com</span></strong></span></em></a></u>&nbsp;<br />
Thanks for following my WeChat official account (scan with WeChat):</p>
<p style="border: 0px; font-size: 13px; margin: 0px 0px 9px; outline: 0px; padding: 0px; color: rgb(77, 77, 77);">
	<img decoding="async" alt="wechat qrcode of codelast" src="https://www.codelast.com/codelast_wechat_qr_code.jpg" style="width: 200px; height: 200px;" /></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.codelast.com/%e5%8e%9f%e5%88%9b-ell%ef%bc%88embedded-learning-library%ef%bc%8c%e5%be%ae%e8%bd%af%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%93%ef%bc%89%e6%96%87%e7%ab%a0%e5%90%88%e9%9b%86/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>[Original] Using Microsoft's ELL Embedded Learning Library on Raspberry Pi 3 (4)</title>
		<link>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%934/</link>
					<comments>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%934/#respond</comments>
		
		<dc:creator><![CDATA[learnhard]]></dc:creator>
		<pubDate>Sat, 12 Aug 2017 10:46:44 +0000</pubDate>
				<category><![CDATA[Linux]]></category>
		<category><![CDATA[Raspberry Pi/树莓派]]></category>
		<category><![CDATA[原创]]></category>
		<category><![CDATA[Embedded Learning Library]]></category>
		<category><![CDATA[Microsoft ELL]]></category>
		<category><![CDATA[Raspberry Pi]]></category>
		<category><![CDATA[机器视觉]]></category>
		<category><![CDATA[树莓派]]></category>
		<guid isPermaLink="false">https://www.codelast.com/?p=9609</guid>

					<description><![CDATA[<p>
This post continues <a href="https://www.codelast.com/?p=9673" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">the previous article</span></a>.<br />
In late June 2017, Microsoft released <a href="https://github.com/Microsoft/ELL" rel="noopener noreferrer" target="_blank"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a> (<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>), a machine learning library aimed mainly at embedded systems (e.g., the Raspberry Pi and ARM Cortex-M0).<br />
Getting the ELL demo to run on the Raspberry Pi first requires a lot of work on a PC, and the previous posts recorded the many problems hit along the way.<br />
Starting with this post, we can finally move the work onto the Raspberry Pi itself. All the earlier obstacles were worth it; success is now very close.<br />
Note: <span style="color: rgb(0, 0, 255);">every operation in this post is performed on the </span><span style="color:#ff0000;">Raspberry Pi</span><span style="color: rgb(0, 0, 255);">.</span><br />
<span id="more-9609"></span></p>
<ul>
<li>
		<span style="background-color:#dda0dd;">Goal and steps</span></li>
</ul>
<p>In the previous article I built a&#160;<span style="color:#0000ff;">compiled_darknetReference_pi3</span> directory on an Ubuntu PC and copied it to the Raspberry Pi. The task now is to use the files in that directory to build a Python module (<span style="color:#006400;">_darknetReference.so</span>); then we can run ELL's image classification demo on the Pi.<br />
Before building that Python module, though, there is some preparation to do: installing the required software.<br />
<span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a></p>
<ul>
<li>
		<span style="background-color:#dda0dd;">Install Python 3.4 via miniconda</span></li>
</ul>
<p>First, why install Python 3.4 on the Raspberry Pi at all?<br />
Because the demo ELL ships is a Python program, and the Python version it supports is 3.4.</p>
<p>In <a href="https://www.codelast.com/?p=9493" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">an earlier article</span></a> I installed Python 3.6 on the Ubuntu PC via miniconda, and explained why it should go into a conda environment rather than straight into the system.<br />
On the Raspberry Pi we do the same, using a miniconda environment to install Python 3.4:</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &#34;Lucida Console&#34;, &#34;DejaVu Sans Mono&#34;, Monaco, &#34;Courier New&#34;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
wget http://repo.continuum.io/miniconda/Miniconda3-latest-Linux-armv7l.sh</pre>&#8230; <a href="https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%934/" class="read-more">Read More </a>]]></description>
										<content:encoded><![CDATA[<p>
This post continues <a href="https://www.codelast.com/?p=9673" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">the previous article</span></a>.<br />
In late June 2017, Microsoft released <a href="https://github.com/Microsoft/ELL" rel="noopener noreferrer" target="_blank"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a> (<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>), a machine learning library aimed mainly at embedded systems (e.g., the Raspberry Pi and ARM Cortex-M0).<br />
Getting the ELL demo to run on the Raspberry Pi first requires a lot of work on a PC, and the previous posts recorded the many problems hit along the way.<br />
Starting with this post, we can finally move the work onto the Raspberry Pi itself. All the earlier obstacles were worth it; success is now very close.<br />
Note: <span style="color: rgb(0, 0, 255);">every operation in this post is performed on the </span><span style="color:#ff0000;">Raspberry Pi</span><span style="color: rgb(0, 0, 255);">.</span><br />
<span id="more-9609"></span></p>
<ul>
<li>
		<span style="background-color:#dda0dd;">Goal and steps</span></li>
</ul>
<p>In the previous article I built a&nbsp;<span style="color:#0000ff;">compiled_darknetReference_pi3</span> directory on an Ubuntu PC and copied it to the Raspberry Pi. The task now is to use the files in that directory to build a Python module (<span style="color:#006400;">_darknetReference.so</span>); then we can run ELL's image classification demo on the Pi.<br />
Before building that Python module, though, there is some preparation to do: installing the required software.<br />
<span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a></p>
<ul>
<li>
		<span style="background-color:#dda0dd;">Install Python 3.4 via miniconda</span></li>
</ul>
<p>First, why install Python 3.4 on the Raspberry Pi at all?<br />
Because the demo ELL ships is a Python program, and the Python version it supports is 3.4.</p>
<p>In <a href="https://www.codelast.com/?p=9493" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">an earlier article</span></a> I installed Python 3.6 on the Ubuntu PC via miniconda, and explained why it should go into a conda environment rather than straight into the system.<br />
On the Raspberry Pi we do the same, using a miniconda environment to install Python 3.4:</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
wget http://repo.continuum.io/miniconda/Miniconda3-latest-Linux-armv7l.sh
chmod +x Miniconda3-latest-Linux-armv7l.sh
./Miniconda3-latest-Linux-armv7l.sh</pre>
<p>That was easy to install.<br />
Note: as in the earlier articles, near the end of the installation I let miniconda add its PATH entry to my .bashrc file.</p>
<ul>
<li>
		<span style="background-color:#dda0dd;">Install the required packages in the conda environment</span></li>
</ul>
<p>The conda environment can be activated like this:</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(255, 176, 84);">source</span> <span style="color: rgb(255, 157, 0);">~</span>/.bashrc
conda create --name py34 python=3.4
<span style="color: rgb(255, 176, 84);">source</span> activate py34</pre>
<p><span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
Install NumPy and OpenCV in this conda environment:</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(225, 239, 255);">(</span>py34<span style="color: rgb(225, 239, 255);">)</span>[root@alarmpi <span style="color: rgb(255, 157, 0);">~</span>]# conda install numpy
<span style="color: rgb(225, 239, 255);">(</span>py34<span style="color: rgb(225, 239, 255);">)</span>[root@alarmpi <span style="color: rgb(255, 157, 0);">~</span>]# conda install -c microsoft-ell opencv</pre>
<p>This step is basically free of problems and pitfalls.</p>
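Before moving on, a quick import check can save a failed demo run later. This is only a sketch: the `check_import` helper is made up for illustration, and the stdlib module `json` stands in for the real targets (`numpy`, `cv2`) so the snippet runs anywhere:

```shell
# Hypothetical helper: report whether a Python module imports cleanly.
check_import() {
  python3 -c "import $1" 2>/dev/null && echo "$1 OK" || echo "$1 MISSING"
}

# In the py34 environment on the Pi, you would check the demo's real
# dependencies instead:
#   check_import numpy
#   check_import cv2
check_import json   # stdlib stand-in so this sketch runs anywhere
```

A clean "OK" for each module means the conda environment is ready for the demo.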
<ul>
<li>
		<span style="background-color:#dda0dd;">Install OpenBLAS</span></li>
</ul>
<p>Before actually building the Python module, there is one more dependency to install: OpenBLAS. On Raspbian a plain apt-get install would do it, but I happen to run Arch Linux ARM, which has no OpenBLAS package installable directly via pacman -S (the equivalent of Raspbian's apt-get install). So I searched for it on Arch Linux ARM:</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
[root@alarmpi <span style="color: rgb(255, 157, 0);">~</span>]# pacman -Ss blas
extra/blas 3.7.1-1
    Basic Linear Algebra Subprograms
extra/cblas 3.7.1-1
    C interface to BLAS
extra/liblastfm 1.0.9-2
    A Qt4 C++ library <span style="color: rgb(255, 157, 0);">for</span> <span style="color: rgb(204, 204, 204);">the</span> Last.fm webservices</pre>
<p>Only two packages are relevant: blas and cblas. Installing cblas automatically pulls in blas, because cblas depends on blas.<br />
Which one should I install? No idea. So I installed both blas and cblas, to see whether the later steps would go through.<br />
(Spoiler: this road is a dead end. If you don't want to wait, jump straight to the solution behind <a href="https://www.codelast.com/?p=9710" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">this</span></a> link.)<br />
<span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a></p>
<ul>
<li>
		<span style="background-color:#dda0dd;">Install other dependencies</span></li>
</ul>
<p>Besides OpenBLAS, a few more dependencies need installing. I discovered each of them from the error messages after the demo program crashed; on your Raspberry Pi they may already be present, or may have been pulled in while installing other software:</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
pacman -S ffmpeg2.8 libpng12</pre>
<p></p>
<ul>
<li>
		<span style="background-color:#dda0dd;">Build the Python module</span></li>
</ul>
<p>Remember the&nbsp;compiled_darknetReference_pi3 directory we copied over from the PC? Now we go into it and build. One important caveat: do <span style="color:#ff0000;">not</span> build inside the miniconda environment.</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(255, 176, 84);">cd</span> compiled_darknetReference_pi3
mkdir build
<span style="color: rgb(255, 176, 84);">cd</span> build
cmake ..
make</pre>
<p>If all goes well, a 30 MB <span style="color:#006400;">_darknetReference.so</span> is built in the build directory.<br />
But does this Python module actually work? You won't know until you try, so run the ELL demo to test it.</p>
<ul>
<li>
		<span style="background-color:#dda0dd;">Test the Python module: run the ELL demo</span></li>
</ul>
<p>Activate the miniconda environment, plug a USB camera into the Raspberry Pi's USB port, then run the ELL demo:</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(225, 239, 255);">(</span>py34<span style="color: rgb(225, 239, 255);">)</span>[root@alarmpi <span style="color: rgb(255, 157, 0);">~</span>]# <span style="color: rgb(255, 176, 84);">cd</span> compiled_darknetReference_pi3
<span style="color: rgb(225, 239, 255);">(</span>py34<span style="color: rgb(225, 239, 255);">)</span>[root@alarmpi <span style="color: rgb(255, 157, 0);">~</span>]# python compiledDarknetDemo.py</pre>
<p>If you are really, truly lucky, the demo will just run.<br />
I should point out that my Raspberry Pi OS, Arch Linux ARM, has no graphical desktop installed, so running the demo from the command line fails straight away:</p>
<blockquote>
<p>
		(frame:12648): Gtk-WARNING **: cannot open display:</p>
</blockquote>
<div>
	The program uses GUI-related functionality, so failing here is expected. As with the demo on the PC, I made <a href="https://www.codelast.com/?p=9594" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">a similar modification</span></a>: remove the GUI parts and only print the prediction results to the command line.<br />
	<span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
	Even with that change, did the demo run?<br />
	I wasn't that lucky. Running python compiledDarknetDemo.py still fails:</div>
<blockquote>
<div>
		ImportError: build/_darknetReference.so: undefined symbol: cblas_sgemm</div>
</blockquote>
<div>
	The root cause: the blas installed via pacman -S cblas does not work, as already spoiled above. See <a href="https://www.codelast.com/?p=9710" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">this</span></a> link for the fix.
<p>	With that final problem solved, I could at last run the darknet demo on the Raspberry Pi. Deploying ELL to the Pi is a genuinely tedious process, mostly because of version problems among the various dependencies.</p>
<p>	In the next article we will take a quick look at how practical ELL is on the Raspberry Pi: speed, accuracy, and so on.<br />
	<span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
	<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;Copyright notice&nbsp;<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;<br />
	Please credit the source when reposting: <u><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><em><span style="color: rgb(0, 0, 255);"><strong style="font-size: 16px;"><span style="font-family: arial, helvetica, sans-serif;">codelast.com</span></strong></span></em></a></u>&nbsp;<br />
	Thanks for following my WeChat official account (scan with WeChat):</p>
<p style="border: 0px; font-size: 13px; margin: 0px 0px 9px; outline: 0px; padding: 0px; color: rgb(77, 77, 77);">
		<img decoding="async" alt="wechat qrcode of codelast" src="https://www.codelast.com/codelast_wechat_qr_code.jpg" style="width: 200px; height: 200px;" /></p>
</div>
]]></content:encoded>
					
					<wfw:commentRss>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%934/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>[Original] Running the ELL demo on Raspberry Pi 3 fails with: ImportError: build/_darknetReference.so: undefined symbol: cblas_sgemm</title>
		<link>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e8%b7%91ell%e7%9a%84demo%e6%8a%a5%e9%94%99%ef%bc%9aimporterror-build_darknetreference-so-undefined-symbol-cblas_sgemm/</link>
					<comments>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e8%b7%91ell%e7%9a%84demo%e6%8a%a5%e9%94%99%ef%bc%9aimporterror-build_darknetreference-so-undefined-symbol-cblas_sgemm/#respond</comments>
		
		<dc:creator><![CDATA[learnhard]]></dc:creator>
		<pubDate>Fri, 11 Aug 2017 11:30:18 +0000</pubDate>
				<category><![CDATA[Linux]]></category>
		<category><![CDATA[Raspberry Pi/树莓派]]></category>
		<category><![CDATA[原创]]></category>
		<category><![CDATA[Embedded Learning Library]]></category>
		<category><![CDATA[Microsoft ELL]]></category>
		<category><![CDATA[Raspberry Pi]]></category>
		<category><![CDATA[机器视觉]]></category>
		<category><![CDATA[树莓派]]></category>
		<guid isPermaLink="false">https://www.codelast.com/?p=9710</guid>

					<description><![CDATA[<p>
OS: Arch Linux ARM<br />
gcc version: 7.1.1 20170516 (GCC)</p>
<p>In late June 2017, Microsoft released <a href="https://github.com/Microsoft/ELL" target="_blank" rel="noopener noreferrer"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a> (<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>), a machine learning library aimed mainly at embedded systems (e.g., the Raspberry Pi and ARM Cortex-M0).<br />
This post explains how to fix an &#8220;<span style="color:#b22222;">undefined symbol: cblas_sgemm</span>&#8221; error hit while running ELL's demo program on the Raspberry Pi.<br />
<span id="more-9710"></span><br />
With nearly all the preparation done, running ELL's demo on the Raspberry Pi may fail with this error:</p>
<blockquote>
<div>
		(py34)[root@alarmpi compiled_darknetReference_pi3]# python compiledDarknetDemo.py</div>
<div>
		Traceback (most recent call last):</div>
<div>
		File &#34;/root/raspberry-pi/ai/ell-related/compiled_darknetReference_pi3/darknetReference.py&#34;, line 14, in swig_import_helper</div>
<div>
		return importlib.import_module(mname)</div>
<div>
		File &#34;/root/.miniconda3/envs/py34/lib/python3.4/importlib/init.py&#34;,</div></blockquote>&#8230; <a href="https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e8%b7%91ell%e7%9a%84demo%e6%8a%a5%e9%94%99%ef%bc%9aimporterror-build_darknetreference-so-undefined-symbol-cblas_sgemm/" class="read-more">Read More </a>]]></description>
										<content:encoded><![CDATA[<p>
OS: Arch Linux ARM<br />
gcc version: 7.1.1 20170516 (GCC)</p>
<p>In late June 2017, Microsoft released <a href="https://github.com/Microsoft/ELL" target="_blank" rel="noopener noreferrer"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a> (<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>), a machine learning library aimed mainly at embedded systems (e.g., the Raspberry Pi and ARM Cortex-M0).<br />
This post explains how to fix an &ldquo;<span style="color:#b22222;">undefined symbol: cblas_sgemm</span>&rdquo; error hit while running ELL's demo program on the Raspberry Pi.<br />
<span id="more-9710"></span><br />
With nearly all the preparation done, running ELL's demo on the Raspberry Pi may fail with this error:</p>
<blockquote>
<div>
		(py34)[root@alarmpi compiled_darknetReference_pi3]# python compiledDarknetDemo.py</div>
<div>
		Traceback (most recent call last):</div>
<div>
		File &quot;/root/raspberry-pi/ai/ell-related/compiled_darknetReference_pi3/darknetReference.py&quot;, line 14, in swig_import_helper</div>
<div>
		return importlib.import_module(mname)</div>
<div>
		File &quot;/root/.miniconda3/envs/py34/lib/python3.4/importlib/init.py&quot;, line 109, in import_module</div>
<div>
		return _bootstrap._gcd_import(name[level:], package, level)</div>
<div>
		File &quot;&quot;, line 2254, in _gcd_import</div>
<div>
		File &quot;&quot;, line 2237, in _find_and_load</div>
<div>
		File &quot;&quot;, line 2226, in _find_and_load_unlocked</div>
<div>
		File &quot;&quot;, line 1191, in _load_unlocked</div>
<div>
		File &quot;&quot;, line 1161, in _load_backward_compatible</div>
<div>
		File &quot;&quot;, line 539, in _check_name_wrapper</div>
<div>
		File &quot;&quot;, line 1715, in load_module</div>
<div>
		File &quot;&quot;, line 321, in _call_with_frames_removed</div>
<div>
		<span style="color:#ff0000;">ImportError: build/_darknetReference.so: undefined symbol: cblas_sgemm</span></div>
<div>
		&nbsp;</div>
<div>
		During handling of the above exception, another exception occurred:</div>
<div>
		&nbsp;</div>
<div>
		Traceback (most recent call last):</div>
<div>
		File &quot;compiledDarknetDemo.py&quot;, line 11, in&nbsp;</div>
<div>
		import darknetReference as model</div>
<div>
		File &quot;/root/raspberry-pi/ai/ell-related/compiled_darknetReference_pi3/darknetReference.py&quot;, line 17, in&nbsp;</div>
<div>
		_darknetReference = swig_import_helper()</div>
<div>
		File &quot;/root/raspberry-pi/ai/ell-related/compiled_darknetReference_pi3/darknetReference.py&quot;, line 16, in swig_import_helper</div>
<div>
		return importlib.import_module(&#39;_darknetReference&#39;)</div>
<div>
		File &quot;/root/.miniconda3/envs/py34/lib/python3.4/importlib/init.py&quot;, line 109, in import_module</div>
<div>
		return _bootstrap._gcd_import(name[level:], package, level)</div>
<div>
		<span style="color:#ff0000;">ImportError: build/_darknetReference.so: undefined symbol: cblas_sgemm</span></div>
</blockquote>
<div>
	<br />
	The crux is the two lines highlighted in red: the Python module&nbsp;<span style="color:#0000ff;">_darknetReference.so</span> we built on the Raspberry Pi cannot resolve the function&nbsp;<span style="color:#b22222;">cblas_sgemm</span> at runtime, a function that should be defined in the blas library.<br />
	<span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
	So the blas library I installed with pacman -S cblas blas does not work.<br />
	On Arch Linux ARM, these are the only blas-related packages to be found:</div>
<pre style="font-size: 0.9333em; width: 828.906px; background: rgb(0, 34, 64); margin-top: 0px; margin-bottom: 0px; font-stretch: normal; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; color: rgb(255, 255, 255);">
[root@alarmpi <span style="color: rgb(255, 157, 0);">~</span>]# pacman -Ss blas
extra/blas 3.7.1-1
    Basic Linear Algebra Subprograms
extra/cblas 3.7.1-1
    C interface to BLAS
extra/liblastfm 1.0.9-2
    A Qt4 C++ library <span style="color: rgb(255, 157, 0);">for</span> <span style="color: rgb(204, 204, 204);">the</span> Last.fm webservices</pre>
<p>
	I tried it every way: <span style="color:#b22222;">installing blas alone, installing cblas, or installing both together; none of them solves the problem</span>.<br />
	The message below is the cmake output from building <span style="color: rgb(0, 0, 255);">_darknetReference.so</span> after I had installed both blas and cblas:</p>
<blockquote>
<div>
			[root@alarmpi build]# cmake ..</div>
<div>
			-- The C compiler identification is GNU 7.1.1</div>
<div>
			-- The CXX compiler identification is GNU 7.1.1</div>
<div>
			-- Check for working C compiler: /usr/bin/cc</div>
<div>
			-- Check for working C compiler: /usr/bin/cc -- works</div>
<div>
			-- Detecting C compiler ABI info</div>
<div>
			-- Detecting C compiler ABI info - done</div>
<div>
			-- Detecting C compile features</div>
<div>
			-- Detecting C compile features - done</div>
<div>
			-- Check for working CXX compiler: /usr/bin/c++</div>
<div>
			-- Check for working CXX compiler: /usr/bin/c++ -- works</div>
<div>
			-- Detecting CXX compiler ABI info</div>
<div>
			-- Detecting CXX compiler ABI info - done</div>
<div>
			-- Detecting CXX compile features</div>
<div>
			-- Detecting CXX compile features - done</div>
<div>
			-- Looking for pthread.h</div>
<div>
			-- Looking for pthread.h - found</div>
<div>
			-- Looking for pthread_create</div>
<div>
			-- Looking for pthread_create - not found</div>
<div>
			-- Looking for pthread_create in pthreads</div>
<div>
			-- Looking for pthread_create in pthreads - not found</div>
<div>
			-- Looking for pthread_create in pthread</div>
<div>
			-- Looking for pthread_create in pthread - found</div>
<div>
			-- Found Threads: TRUE &nbsp;</div>
<div>
			<span style="color:#0000ff;">-- Blas libraries: /usr/lib/libblas.so</span></div>
<div>
			-- Blas linker flags:&nbsp;</div>
<div>
			-- Blas include directories:&nbsp;</div>
<div>
			<span style="color:#0000ff;">-- Using BLAS include path: /usr/include</span></div>
<div>
			<span style="color:#0000ff;">-- Using BLAS library: /usr/lib/libblas.so</span></div>
<div>
			-- Using BLAS DLLs:&nbsp;</div>
<div>
			-- Found PythonInterp: /usr/bin/python3 (found suitable version &quot;3.6.2&quot;, minimum required is &quot;3.4&quot;)&nbsp;</div>
<div>
			-- Found PythonLibs: /usr/lib/libpython3.6m.so (found suitable version &quot;3.6.2&quot;, minimum required is &quot;3.4&quot;)&nbsp;</div>
<div>
			-- Configuring done</div>
<div>
			-- Generating done</div>
<div>
			-- Build files have been written to: /root/raspberry-pi/ai/ell-related/compiled_darknetReference_pi3/build</div>
</blockquote>
<div>
		Note the lines highlighted in blue: the blas dependency appears to be found, and the build runs to completion, yet the resulting <span style="color: rgb(0, 0, 255);">_darknetReference.so</span> is unusable.<br />
		Strangely, I checked the installed blas library file (.so), and it really does export the&nbsp;cblas_sgemm function; why the built Python module still doesn't work, I never figured out:
<blockquote>
<div>
				nm -D /usr/lib/libblas.so | grep cblas_sgemm</div>
<div>
				(non-empty output, e.g. &quot;0005f230 T cblas_sgemm&quot;)</div>
</blockquote>
<p></p>
<div>
			<span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
			I also tried installing only blas, without cblas; then the following two lines never appear in the cmake output:</div>
<blockquote>
<div>
				-- Using BLAS include path: /usr/include</div>
<div>
				-- Using BLAS library: /usr/lib/libblas.so</div>
</blockquote>
<div>
			Instead, you get:</div>
<blockquote>
<div>
				-- Blas include directories:&nbsp;</div>
<div>
				-- BLAS library not found</div>
</blockquote>
<div>
			This means cmake could not even find the blas dependency, which is obviously a non-starter. No need to test to know the outcome: the resulting <span style="color: rgb(0, 0, 255);">_darknetReference.so</span> is unusable as well.</div>
</div>
<p>
	After all these failures I was baffled for a while, but after some exploration I finally found a workable solution: <span style="color:#b22222;">build and install OpenBLAS yourself, and slightly tweak the build of ELL's Python module so that the build scripts find the correct blas library</span>.<br />
	<span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
	Without further ado, let's get to work:</p>
<ul>
<li>
			<span style="background-color:#dda0dd;">Download the OpenBLAS source &amp; build it</span></li>
</ul>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
git clone https://github.com/xianyi/OpenBLAS
<span style="color: rgb(255, 176, 84);">cd</span> OpenBLAS
make</pre>
<p>	The last chunk of the build output looks like this:</p>
<blockquote>
<div>
			......</div>
<div>
			make[1]: Leaving directory &#39;/root/resource/OpenBLAS/exports&#39;</div>
<div>
			&nbsp;</div>
<div>
			&nbsp;OpenBLAS build complete. (BLAS CBLAS)&nbsp;</div>
<div>
			&nbsp;</div>
<div>
			&nbsp; OS &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; ... Linux &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
			&nbsp; Architecture &nbsp; &nbsp; ... arm &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
			&nbsp; BINARY &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; ... 32bit &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
			&nbsp; C compiler &nbsp; &nbsp; &nbsp; ... GCC &nbsp;(command line : gcc) &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
			&nbsp; Library Name &nbsp; &nbsp; ... libopenblas_armv7p-r0.3.0.dev.a (Multi threaded; Max num-threads is 4)</div>
<div>
			&nbsp;</div>
<div>
			To install the library, you can run &quot;make PREFIX=/path/to/your/installation install&quot;.</div>
</blockquote>
<p>	最后一句提示我们，可以通过 <span style="color:#0000ff;">make PREFIX=路径 install</span>&nbsp;的方式，把编译好的OpenBLAS安装到指定的路径下。</p>
<div>
		<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a></p>
<ul>
<li>
				<span style="background-color:#dda0dd;">安装OpenBLAS到自定义的目录下</span></li>
</ul>
<p>		我不想搞乱系统目录，所以就指定了安装目录：</p>
<blockquote>
<div>
				[root@alarmpi OpenBLAS]# make PREFIX=/usr/lib/openblas install</div>
<div>
				make -j 4 -f Makefile.install install &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
				make[1]: Entering directory &#39;/root/resource/OpenBLAS&#39; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
				Generating openblas_config.h in /usr/lib/openblas/include &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
				Generating f77blas.h in /usr/lib/openblas/include &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
				Generating cblas.h in /usr/lib/openblas/include &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
				Copying the static library to /usr/lib/openblas/lib &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
				Copying the shared library to /usr/lib/openblas/lib &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
				Generating openblas.pc in /usr/lib/openblas/lib/pkgconfig &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
				Generating OpenBLASConfig.cmake in /usr/lib/openblas/lib/cmake/openblas</div>
<div>
				Generating OpenBLASConfigVersion.cmake in /usr/lib/openblas/lib/cmake/openblas</div>
<div>
				Install OK! &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;</div>
<div>
				make[1]: Leaving directory &#39;/root/resource/OpenBLAS&#39;</div>
</blockquote>
<ul>
<li>
				<span style="background-color:#dda0dd;">重新编译_darknetReference.so的一些准备工作</span></li>
</ul>
<div>
			（1）卸载之前安装的 blas、cblas</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
pacman -R blas cblas</pre>
<p>
			（2）把OpenBLAS的lib路径添加到LD_LIBRARY_PATH中</div>
<div>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
[root@alarmpi build]# <span style="color: rgb(255, 238, 128);">export</span> LD_LIBRARY_PATH=<span style="color: rgb(204, 204, 204);"><span style="color: rgb(225, 239, 255);">$</span>LD_LIBRARY_PATH</span>:/usr/lib/openblas/lib</pre>
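<p>注意 export 只对当前 shell 会话生效。如果想在 Python 脚本里（例如调用编译脚本之前）做同样的事，可以像下面这样把路径追加到环境变量中，这只是一个示意写法：</p>

```python
import os

def append_ld_library_path(env, libdir):
    """把 libdir 追加到 env 的 LD_LIBRARY_PATH 末尾（已存在则不重复添加）。"""
    old = env.get("LD_LIBRARY_PATH", "")
    parts = [p for p in old.split(":") if p]
    if libdir not in parts:
        parts.append(libdir)
    env["LD_LIBRARY_PATH"] = ":".join(parts)
    return env

# 在当前进程环境的副本上演示
env = append_ld_library_path(dict(os.environ), "/usr/lib/openblas/lib")
```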
<p>			<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" target="_blank" rel="noopener noreferrer"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
			（3）修改 <span style="color:#006400;">compiled_darknetReference_pi3/OpenBLASSetup.cmake</span> 文件<br />
			这个文件定义了如何找到 OpenBLAS 的include头文件以及.so文件，所以我把路径&nbsp;<span style="color:#0000ff;">/usr/lib/openblas/include/</span> 添加到blas的search路径中：</p>
<blockquote>
<div>
					set(BLAS_INCLUDE_SEARCH_PATHS</div>
<div>
					&nbsp; &nbsp; /System/Library/Frameworks/Accelerate.framework/Versions/Current/Frameworks/vecLib.framework/Versions/Current/Headers/</div>
<div>
					&nbsp; &nbsp; /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/System/Library/Frameworks/Accelerate.framework/Frameworks/vecLib.framework/Headers/</div>
<div>
					&nbsp; &nbsp; /usr/include</div>
<div>
					&nbsp; &nbsp; /usr/local/include</div>
<div>
					<span style="color:#0000ff;">&nbsp; &nbsp; /usr/lib/openblas/include</span></div>
<div>
					)</div>
</blockquote>
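<p>cmake 会按这个列表的顺序逐个目录查找 cblas.h 之类的头文件，逻辑大致相当于下面这个 Python 小函数（仅帮助理解查找顺序，并非 cmake 的实际实现）：</p>

```python
import os

def find_header(name, search_paths):
    """按顺序在 search_paths 中查找头文件，返回第一个命中的完整路径，找不到返回 None。"""
    for d in search_paths:
        candidate = os.path.join(d, name)
        if os.path.isfile(candidate):
            return candidate
    return None
```
<p>这也解释了为什么把 /usr/lib/openblas/include 追加到列表末尾就能生效：前面的路径都找不到时，就会落到我们新加的那一条上。</p>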
<div>
				蓝色那一句是我加的。</p>
<ul>
<li>
						<span style="background-color:#dda0dd;">重新编译_darknetReference.so</span></li>
</ul>
<p>				cmake的输出大部分与之前相同，不同的是这几句：</p>
<blockquote>
<div>
						-- Blas libraries: /usr/lib/openblas/lib/libopenblas.so</div>
<div>
						-- Blas linker flags:</div>
<div>
						-- Blas include directories:</div>
<div>
						-- Using BLAS include path: /usr/lib/openblas/include</div>
<div>
						-- Using BLAS library: /usr/lib/openblas/lib/libopenblas.so</div>
</blockquote>
<div>
					找到的blas路径都是我安装的OpenBLAS路径，可见以上修改真的生效了。</p>
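<p>之前失败的根本原因，就是 .so 在运行时解析不到 cblas_sgemm 这类符号。想在树莓派上快速验证某个共享库能否解析到某个符号，可以用 ctypes 写个小检查（下面注释中的 _darknetReference.so 路径仅为示意）：</p>

```python
import ctypes

def has_symbol(lib_path, symbol):
    """尝试加载共享库并解析符号；加载失败或符号不存在都返回 False。"""
    try:
        lib = ctypes.CDLL(lib_path)
        getattr(lib, symbol)
        return True
    except (OSError, AttributeError):
        return False

# 用法示意（路径为假设）：
# has_symbol("./_darknetReference.so", "cblas_sgemm")
```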
<p>					最后在miniconda环境下用 <span style="color:#0000ff;">python compiledDarknetDemo.py</span> 测试，发现编译出来的Python module果然work了，问题解决！<br />
					<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
					<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;版权声明&nbsp;<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;<br />
					转载需注明出处：<u><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><em><span style="color: rgb(0, 0, 255);"><strong style="font-size: 16px;"><span style="font-family: arial, helvetica, sans-serif;">codelast.com</span></strong></span></em></a></u>&nbsp;<br />
					感谢关注我的微信公众号（微信扫一扫）：</p>
<p style="border: 0px; font-size: 13px; margin: 0px 0px 9px; outline: 0px; padding: 0px; color: rgb(77, 77, 77);">
						<img decoding="async" alt="wechat qrcode of codelast" src="https://www.codelast.com/codelast_wechat_qr_code.jpg" style="width: 200px; height: 200px;" /></p>
</p></div>
</p></div>
</p></div>
</p></div>
</div>
]]></content:encoded>
					
					<wfw:commentRss>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e8%b7%91ell%e7%9a%84demo%e6%8a%a5%e9%94%99%ef%bc%9aimporterror-build_darknetreference-so-undefined-symbol-cblas_sgemm/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>[原创] 在树莓派3上使用微软ELL嵌入式学习库(3)</title>
		<link>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%933/</link>
					<comments>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%933/#respond</comments>
		
		<dc:creator><![CDATA[learnhard]]></dc:creator>
		<pubDate>Tue, 08 Aug 2017 16:01:01 +0000</pubDate>
				<category><![CDATA[Linux]]></category>
		<category><![CDATA[Raspberry Pi/树莓派]]></category>
		<category><![CDATA[原创]]></category>
		<category><![CDATA[Embedded Learning Library]]></category>
		<category><![CDATA[Microsoft ELL]]></category>
		<category><![CDATA[Raspberry Pi]]></category>
		<category><![CDATA[机器视觉]]></category>
		<category><![CDATA[树莓派]]></category>
		<guid isPermaLink="false">https://www.codelast.com/?p=9673</guid>

					<description><![CDATA[<p>
本文是<a href="https://www.codelast.com/?p=9635" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">上一篇文章</span></a>的续文。<br />
微软于2017年6月底发布了一个主要用于嵌入式系统（例如，树莓派，ARM Cortex-M0等）的机器学习库<a href="https://github.com/Microsoft/ELL" rel="noopener noreferrer" target="_blank"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a>（<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>，<span style="color: rgb(0, 0, 255);">嵌入式学习库</span>）。<br />
在前几篇文章中，我在Ubuntu PC上对ELL里现成可用的model进行了测试，虽然由于我台式机太老旧的原因，运行速度相当之慢，不过它终究跑通了demo，下一步，我们需要在台式机上，把ELL的model编译到目标平台上&#8212;&#8212;在这里，目标平台指的就是树莓派3。<br />
注：<span style="color: rgb(0, 0, 255);">本文的所有操作，都是在台式机上运行的。</span><br />
<span id="more-9673"></span></p>
<ul>
<li>
		<span style="background-color:#dda0dd;">为目标平台（树莓派）编译代码</span></li>
</ul>
<p>ELL自带了一个非常酷的神经网络模型编译器，它可以为目标平台生成高度优化的代码，从而可以让model在目标平台上以很快的速度运行。<br />
在前面运行 CNTK demo的时候，细心的你一定会发现，demo运行的同时，在&#160;ELL/build/tutorials/vision/gettingStarted/ 目录下生成了一个&#160;vgg16ImageNet.map 文件（如果你使用的是Darknet，生成的就是&#160;darknetReference.map），这个文件是ELL格式的model文件，之后会用到。<br />
<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
所以二话不说，直接开始编译：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &#34;Lucida Console&#34;, &#34;DejaVu Sans Mono&#34;, Monaco, &#34;Courier New&#34;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(255, 176, 84);">cd</span> ELL/build
make compiled_vgg16ImageNet_pi3</pre>
<p>我以为这个过程会很顺利，结果呢？我试了很多次，每一次编译经过很长时间的等待之后，都以失败告终，错误信息大概类似于下面这样：</p>
<blockquote>
<div>
		(前面还有很多，省略)</div>
<div>
		[ 85%] Built target trainers</div>
<div>
		[ 96%] Built target common</div>
<div>
		[ 98%] Built target compile</div>
<div>
		[ 98%] Generating /home/codelast/programme/pi/ELL/build/tutorials/vision/gettingStarted/compiled_vgg16ImageNet_pi3/vgg16ImageNet.ll;/home/codelast/programme/pi/ELL/build/tutorials/vision/gettingStarted/compiled_vgg16ImageNet_pi3/vgg16ImageNet.i;/home/codelast/programme/pi/ELL/build/tutorials/vision/gettingStarted/compiled_vgg16ImageNet_pi3/vgg16ImageNet.i.h</div></blockquote>&#8230; <a href="https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%933/" class="read-more">Read More </a>]]></description>
										<content:encoded><![CDATA[<p>
本文是<a href="https://www.codelast.com/?p=9635" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">上一篇文章</span></a>的续文。<br />
微软于2017年6月底发布了一个主要用于嵌入式系统（例如，树莓派，ARM Cortex-M0等）的机器学习库<a href="https://github.com/Microsoft/ELL" rel="noopener noreferrer" target="_blank"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a>（<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>，<span style="color: rgb(0, 0, 255);">嵌入式学习库</span>）。<br />
在前几篇文章中，我在Ubuntu PC上对ELL里现成可用的model进行了测试，虽然由于我台式机太老旧的原因，运行速度相当之慢，不过它终究跑通了demo，下一步，我们需要在台式机上，把ELL的model编译到目标平台上&mdash;&mdash;在这里，目标平台指的就是树莓派3。<br />
注：<span style="color: rgb(0, 0, 255);">本文的所有操作，都是在台式机上运行的。</span><br />
<span id="more-9673"></span></p>
<ul>
<li>
		<span style="background-color:#dda0dd;">为目标平台（树莓派）编译代码</span></li>
</ul>
<p>ELL自带了一个非常酷的神经网络模型编译器，它可以为目标平台生成高度优化的代码，从而可以让model在目标平台上以很快的速度运行。<br />
在前面运行 CNTK demo的时候，细心的你一定会发现，demo运行的同时，在&nbsp;ELL/build/tutorials/vision/gettingStarted/ 目录下生成了一个&nbsp;vgg16ImageNet.map 文件（如果你使用的是Darknet，生成的就是&nbsp;darknetReference.map），这个文件是ELL格式的model文件，之后会用到。<br />
<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
所以二话不说，直接开始编译：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(255, 176, 84);">cd</span> ELL/build
make compiled_vgg16ImageNet_pi3</pre>
<p>我以为这个过程会很顺利，结果呢？我试了很多次，每一次编译经过很长时间的等待之后，都以失败告终，错误信息大概类似于下面这样：</p>
<blockquote>
<div>
		(前面还有很多，省略)</div>
<div>
		[ 85%] Built target trainers</div>
<div>
		[ 96%] Built target common</div>
<div>
		[ 98%] Built target compile</div>
<div>
		[ 98%] Generating /home/codelast/programme/pi/ELL/build/tutorials/vision/gettingStarted/compiled_vgg16ImageNet_pi3/vgg16ImageNet.ll;/home/codelast/programme/pi/ELL/build/tutorials/vision/gettingStarted/compiled_vgg16ImageNet_pi3/vgg16ImageNet.i;/home/codelast/programme/pi/ELL/build/tutorials/vision/gettingStarted/compiled_vgg16ImageNet_pi3/vgg16ImageNet.i.h</div>
<div>
		OpenBLAS : Your OS does not support AVX instructions. OpenBLAS is using Nehalem kernels as a fallback, which may give poorer performance.</div>
<div>
		[100%] Compiling vgg16ImageNet.ll to /home/codelast/programme/pi/ELL/build/tutorials/vision/gettingStarted/compiled_vgg16ImageNet_pi3/vgg16ImageNet.o for pi3</div>
<div>
		<span style="color:#ff0000;">Killed</span></div>
<div>
		make[3]: *** [tutorials/vision/gettingStarted/compiled_vgg16ImageNet_pi3/vgg16ImageNet.o] Error 137</div>
<div>
		make[3]: *** Deleting file `tutorials/vision/gettingStarted/compiled_vgg16ImageNet_pi3/vgg16ImageNet.o&#39;</div>
<div>
		make[2]: *** [tutorials/vision/gettingStarted/CMakeFiles/compiled_vgg16ImageNet_pi3.dir/all] Error 2</div>
<div>
		make[1]: *** [tutorials/vision/gettingStarted/CMakeFiles/compiled_vgg16ImageNet_pi3.dir/rule] Error 2</div>
<div>
		make: *** [compiled_vgg16ImageNet_pi3] Error 2</div>
</blockquote>
<div>
	注意有一个&ldquo;<span style="color:#ff0000;">Killed</span>&rdquo;，这说明编译过程占用了太多资源（多半是内存不足），进程被OS的kernel杀掉了。我说过我用的是一台性能非常差的Ubuntu PC，受硬件条件限制，遇到这种情况在所难免。<br />
	<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a></p>
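<p>顺便解释一下上面 make 输出里的 Error 137：shell 对被信号杀死的进程，约定返回 128 加信号编号，而 137 = 128 + 9，对应的正是 SIGKILL，也就是内核 OOM killer 杀进程时用的信号。可以用一小段 Python 验证这个换算：</p>

```python
import signal

def describe_exit(code):
    """把 shell 返回码翻译成可读说明：大于 128 表示进程被信号杀死。"""
    if code > 128:
        sig = code - 128
        return "killed by signal %d (%s)" % (sig, signal.Signals(sig).name)
    return "exited with status %d" % code
```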
<ul>
<li>
			<span style="background-color:#dda0dd;">换Darknet model</span></li>
</ul>
<p>	现在摆在我面前的路有两条：1) 换一台高性能的PC来尝试编译；2) 把CNTK换成轻量级的Darknet。<br />
	ELL的开发者建议使用轻量级的Darknet，而不是用CNTK。所以我就把前面几篇文章中CNTK的测试流程，换成Darknet又走了一遍。<br />
	重新再来一遍，会发现其实非常顺手和简单了：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(255, 176, 84);">cd</span> build/tutorials/vision/gettingStarted
curl -O https://raw.githubusercontent.com/pjreddie/darknet/master/cfg/darknet.cfg
curl -O https://pjreddie.com/media/files/darknet.weights</pre>
<p>	这样就下载好了Darknet model，你会看到这个model的大小只有28M，而CNTK model（VGG16_ImageNet_Caffe.model）的大小有528M，不是一个数量级的。<br />
	<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
	然后把摄像头连接到PC上，就可以跑Darknet的demo了：</p>
<blockquote>
<p>
			<span style="color:#800080;">python</span> darknetDemo.py</p>
</blockquote>
<p>	一切如此简单。BTW，这个demo跑起来确实比CNTK的速度快多了。</p>
<p>	跑Darknet的demo成功之后，我们就可以像前面说的一样，为目标平台（树莓派）编译Darknet的代码了：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(255, 176, 84);">cd</span> ELL/build
make compiled_darknetReference_pi3</pre>
<p>	这回进程终于没有被kernel kill掉了！看来CNTK的编译过程果然占用了太多OS资源。</p>
<p>	编译成功之后，你可以用任何方式（例如FileZilla，命令行SCP等等），把&nbsp;<span style="color:#0000ff;">ELL/build/tutorials/vision/gettingStarted/compiled_darknetReference_pi3/</span> 这个编译生成的目录，拷贝到树莓派上，我们之后需要在树莓派上，用这个目录下的东西，进一步编译出一个Python module，从而可以在树莓派上跑image classification的demo。<br />
	<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
	仔细看一下就会发现，Darknet的这个目录只有区区的29M大小，而CNTK对应的那个目录&nbsp;compiled_vgg16ImageNet_pi3 竟然有3.4G大！这意味着，如果你决定在CNTK一条路上走到黑，那么你树莓派OS的TF卡就必须至少有16G大，否则你很可能由于OS里还要装一些必备软件，从而导致存储不下这个目录了。我之前尝试用CNTK的时候，就遇到了这样的问题，导致我要<a href="https://www.codelast.com/?p=9536" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">用GParted去resize树莓派的TF卡</span></a>。</p>
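<p>想自己核对这两个目录的大小差异，可以直接 du -sh，也可以用一个等价的 Python 小函数（示意）：</p>

```python
import os

def dir_size_mb(path):
    """递归统计目录下所有普通文件的总大小，单位 MB。"""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):
                total += os.path.getsize(fp)
    return total / (1024 * 1024)

# 用法示意：dir_size_mb("compiled_darknetReference_pi3")
```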
<p>	至此，我们终于完成了在PC上的所有工作，下一步，我们的工作就要转到树莓派上了，请接着看下一篇文章。<br />
	<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
	<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;版权声明&nbsp;<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;<br />
	转载需注明出处：<u><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><em><span style="color: rgb(0, 0, 255);"><strong style="font-size: 16px;"><span style="font-family: arial, helvetica, sans-serif;">codelast.com</span></strong></span></em></a></u>&nbsp;<br />
	感谢关注我的微信公众号（微信扫一扫）：</p>
<p style="border: 0px; font-size: 13px; margin: 0px 0px 9px; outline: 0px; padding: 0px; color: rgb(77, 77, 77);">
		<img decoding="async" alt="wechat qrcode of codelast" src="https://www.codelast.com/codelast_wechat_qr_code.jpg" style="width: 200px; height: 200px;" /></p>
</div>
]]></content:encoded>
					
					<wfw:commentRss>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%933/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>[原创] 在树莓派3上使用微软ELL嵌入式学习库(2)</title>
		<link>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%932/</link>
					<comments>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%932/#respond</comments>
		
		<dc:creator><![CDATA[learnhard]]></dc:creator>
		<pubDate>Mon, 07 Aug 2017 16:01:19 +0000</pubDate>
				<category><![CDATA[Linux]]></category>
		<category><![CDATA[Raspberry Pi/树莓派]]></category>
		<category><![CDATA[原创]]></category>
		<category><![CDATA[Embedded Learning Library]]></category>
		<category><![CDATA[Microsoft ELL]]></category>
		<category><![CDATA[Raspberry Pi]]></category>
		<category><![CDATA[机器视觉]]></category>
		<category><![CDATA[树莓派]]></category>
		<guid isPermaLink="false">https://www.codelast.com/?p=9635</guid>

					<description><![CDATA[<p>
本文是<a href="https://www.codelast.com/?p=9401" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">上一篇文章</span></a>的续文。<br />
微软于2017年6月底发布了一个主要用于嵌入式系统（例如，树莓派，ARM Cortex-M0等）的机器学习库<a href="https://github.com/Microsoft/ELL" rel="noopener noreferrer" target="_blank"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a>（<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>，<span style="color: rgb(0, 0, 255);">嵌入式学习库</span>）。<br />
要在树莓派上使用pre-trained的模型，我们先要在PC上对其进行测试，这个测试说白了就是运行一些Python的demo程序看它们是否能正常工作。而上一篇文章，正是介绍了在Ubuntu PC上的准备工作&#8212;&#8212;如果没有那些准备工作，你连测试的基础条件都不具备。<br />
注：<span style="color: rgb(0, 0, 255);">本文的所有操作，都是在台式机上运行的。</span><br />
<span id="more-9635"></span></p>
<ul>
<li>
		<span style="background-color: rgb(221, 160, 221);">选择模型</span></li>
</ul>
<p>至此，我们终于到模型这一步了。<br />
由于训练机器学习模型的时间通常较长，所以，训练模型这个工作肯定不能放在树莓派上干。我们可以使用别人已经预先训练好的模型来节省时间，ELL文档里推荐了两个：<a href="https://github.com/Microsoft/ELL/blob/master/tutorials/vision/gettingStarted/cntk.md" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">CNTK</span></a>和<a href="https://github.com/Microsoft/ELL/blob/master/tutorials/vision/gettingStarted/darknet.md" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">Darknet</span></a>。<br />
<span style="color: rgb(255, 0, 0);">本文在PC上测试ELL时，将主要使用CNTK；而在树莓派上测试ELL时，出于运行速度等种种原因考虑，不得不使用Darknet（和Darknet相比，CNTK不够轻量级）</span>。</p>
<ul>
<li>
		<span style="background-color: rgb(221, 160, 221);">安装CNTK相关的软件</span></li>
</ul>
<p>在前面的conda环境下，安装CNTK的Python package：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &#34;Lucida Console&#34;, &#34;DejaVu Sans Mono&#34;, Monaco, &#34;Courier New&#34;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
pip install https://cntk.ai/PythonWheel/CPU-Only/cntk-2.0-cp36-cp36m-linux_x86_64.whl</pre>
<p>输出类似于：</p>
<blockquote>
<div>
		Collecting cntk==2.0 from https://cntk.ai/PythonWheel/CPU-Only/cntk-2.0-cp36-cp36m-linux_x86_64.whl</div>
<div>
		&#160; Downloading https://cntk.ai/PythonWheel/CPU-Only/cntk-2.0-cp36-cp36m-linux_x86_64.whl (109.5MB)<br />
		......</div></blockquote>&#8230; <a href="https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%932/" class="read-more">Read More </a>]]></description>
										<content:encoded><![CDATA[<p>
本文是<a href="https://www.codelast.com/?p=9401" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">上一篇文章</span></a>的续文。<br />
微软于2017年6月底发布了一个主要用于嵌入式系统（例如，树莓派，ARM Cortex-M0等）的机器学习库<a href="https://github.com/Microsoft/ELL" rel="noopener noreferrer" target="_blank"><span style="color: rgb(0, 0, 255);"><span style="background-color: rgb(255, 160, 122);">ELL</span></span></a>（<span style="color: rgb(0, 0, 255);">Embedded Learning Library</span>，<span style="color: rgb(0, 0, 255);">嵌入式学习库</span>）。<br />
要在树莓派上使用pre-trained的模型，我们先要在PC上对其进行测试，这个测试说白了就是运行一些Python的demo程序看它们是否能正常工作。而上一篇文章，正是介绍了在Ubuntu PC上的准备工作&mdash;&mdash;如果没有那些准备工作，你连测试的基础条件都不具备。<br />
注：<span style="color: rgb(0, 0, 255);">本文的所有操作，都是在台式机上运行的。</span><br />
<span id="more-9635"></span></p>
<ul>
<li>
		<span style="background-color: rgb(221, 160, 221);">选择模型</span></li>
</ul>
<p>至此，我们终于到模型这一步了。<br />
由于训练机器学习模型的时间通常较长，所以，训练模型这个工作肯定不能放在树莓派上干。我们可以使用别人已经预先训练好的模型来节省时间，ELL文档里推荐了两个：<a href="https://github.com/Microsoft/ELL/blob/master/tutorials/vision/gettingStarted/cntk.md" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">CNTK</span></a>和<a href="https://github.com/Microsoft/ELL/blob/master/tutorials/vision/gettingStarted/darknet.md" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">Darknet</span></a>。<br />
<span style="color: rgb(255, 0, 0);">本文在PC上测试ELL时，将主要使用CNTK；而在树莓派上测试ELL时，出于运行速度等种种原因考虑，不得不使用Darknet（和Darknet相比，CNTK不够轻量级）</span>。</p>
<ul>
<li>
		<span style="background-color: rgb(221, 160, 221);">安装CNTK相关的软件</span></li>
</ul>
<p>在前面的conda环境下，安装CNTK的Python package：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
pip install https://cntk.ai/PythonWheel/CPU-Only/cntk-2.0-cp36-cp36m-linux_x86_64.whl</pre>
<p>输出类似于：</p>
<blockquote>
<div>
		Collecting cntk==2.0 from https://cntk.ai/PythonWheel/CPU-Only/cntk-2.0-cp36-cp36m-linux_x86_64.whl</div>
<div>
		&nbsp; Downloading https://cntk.ai/PythonWheel/CPU-Only/cntk-2.0-cp36-cp36m-linux_x86_64.whl (109.5MB)<br />
		......</div>
<div>
		Successfully installed cntk-2.0</div>
</blockquote>
<p>这一步耗时较长，耐心等待吧。<br />
然后回到ELL代码根目录，切换到gettingStarted目录并下载CNTK的ImageNet推断模型：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(255, 176, 84);">cd</span> build/tutorials/vision/gettingStarted
curl -O https://www.cntk.ai/Models/Caffe_Converted/VGG16_ImageNet_Caffe.model</pre>
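<p>这个 model 文件有 500 多 MB，下载时断流导致文件损坏并不罕见。拿到文件后最好先核对一下大小或校验和，比如用下面这个通用的 SHA-256 函数（示意写法；CNTK 官方是否公布了该文件的校验和，我没有考证）：</p>

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """流式计算文件的 SHA-256，避免把大文件整个读进内存。"""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

# 用法示意：sha256_of("VGG16_ImageNet_Caffe.model")
```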
<ul>
<li>
		<span style="background-color: rgb(221, 160, 221);">测试模型</span></li>
</ul>
<p>下面可以测试一下模型了：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(225, 239, 255);">(</span>py36<span style="color: rgb(225, 239, 255);">)</span> [codelast@ gettingStarted]$ python cntkDemo.py</pre>
<p>
如果此时报错：</p>
<div>
<blockquote>
<div>
			Traceback (most recent call last):</div>
<div>
			&nbsp; File &quot;cntkDemo.py&quot;, line 4, in &lt;module&gt;</div>
<div>
			&nbsp; &nbsp; import cv2</div>
<div>
			ImportError: /home/codelast/.miniconda3/envs/py36/lib/python3.6/site-packages/../../libopencv_dnn.so.3.2: undefined symbol: openblas_get_num_threads</div>
</blockquote>
<div>
		那么说明在miniconda环境下没有装openblas，安装方法：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
conda install openblas</pre>
<p>		然后再试。<br />
		<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
		如果遇到&ldquo;ImportError: libmpi_cxx.so.1: cannot open shared object file: No such file or directory&rdquo;这种错误，说明系统里没有安装Open MPI开发包，理论上应该用下面的命令安装：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
sudo apt-get install libopenmpi-dev</pre>
<div>
			但是，当你安装好之后，重新运行<span style="color: rgb(0, 0, 255);">&nbsp;cntkDemo.py</span>，不出意外的话又会遇到&ldquo;ImportError: libmpi.so.12: cannot open shared object file: No such file or directory&rdquo;的错误，这其实是Open MPI版本太低导致的，解决方案请看<a href="https://www.codelast.com/?p=9464" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">这个链接</span></a>。</p>
<p>			解决了那么那么多的问题，现在总可以了吧？再次执行&nbsp;<span style="color: rgb(0, 0, 255);">cntkDemo.py</span>，等待它运行了一段时间之后，一开始看上去还挺正常的，执行到后面又出错了：</p>
<blockquote>
<div>
					...Finished constructing ELL layers.</div>
<div>
					OpenCV Error: Unspecified error (The function is not implemented. Rebuild the library with Windows, GTK+ 2.x or Carbon support. If you are on Ubuntu or Debian, install libgtk2.0-dev and pkg-config, then re-run cmake or configure script) in cvShowImage, file /feedstock_root/build_artefacts/opencv_1490907195496/work/opencv-3.2.0/modules/highgui/src/window.cpp, line 583</div>
<div>
					Traceback (most recent call last):</div>
<div>
					&nbsp; File &quot;cntkDemo.py&quot;, line 68, in &lt;module&gt;</div>
<div>
					&nbsp; &nbsp; main()</div>
<div>
					&nbsp; File &quot;cntkDemo.py&quot;, line 61, in main</div>
<div>
					&nbsp; &nbsp; cv2.imshow(&#39;frame&#39;, frameToShow)</div>
<div>
					......</div>
</blockquote>
<div>
				这个问题，就是我上一篇文章里所说的，按ELL官方文档里的方法去安装OpenCV会导致的问题，在各种坑中，这是最坑爹的一个了。这个问题的解决办法请看<a href="https://www.codelast.com/?p=9473" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">这个链接</span></a>。<br />
				<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a></div>
</p></div>
<div>
			此时再重新执行&nbsp;<span style="color: rgb(0, 0, 255);">cntkDemo.py</span>，确实不报错了，但是，执行到弹出GUI窗口显示摄像头拍摄的视频流的代码的时候，程序进入假死状态，不能执行后续逻辑。此时我的内心是崩溃的。本着一定要把它搞定的决心，我又做了一些尝试，使得我能够跑起来这个demo，具体请看<a href="https://www.codelast.com/?p=9594" rel="noopener noreferrer" target="_blank"><span style="background-color: rgb(255, 160, 122);">这篇文章</span></a>。</p>
<p>			当在PC上成功地跑起来了CNTK的demo之后，我们就可以认为这个model是work的，然后就要准备把它弄到树莓派上去跑了。<br />
			<span style="color: rgb(255, 255, 255);">文章来源：</span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
			<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;版权声明&nbsp;<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;<br />
			转载需注明出处：<u><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><em><span style="color: rgb(0, 0, 255);"><strong style="font-size: 16px;"><span style="font-family: arial, helvetica, sans-serif;">codelast.com</span></strong></span></em></a></u>&nbsp;<br />
			感谢关注我的微信公众号（微信扫一扫）：</p>
<p style="border: 0px; font-size: 13px; margin: 0px 0px 9px; outline: 0px; padding: 0px; color: rgb(77, 77, 77);">
				<img decoding="async" alt="wechat qrcode of codelast" src="https://www.codelast.com/codelast_wechat_qr_code.jpg" style="width: 200px; height: 200px;" /></p>
</p></div>
</p></div>
</div>
]]></content:encoded>
					
					<wfw:commentRss>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%932/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>[原创] 在树莓派3上使用微软ELL嵌入式学习库(1)</title>
		<link>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%931/</link>
					<comments>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%931/#comments</comments>
		
		<dc:creator><![CDATA[learnhard]]></dc:creator>
		<pubDate>Sun, 06 Aug 2017 16:00:06 +0000</pubDate>
				<category><![CDATA[Linux]]></category>
		<category><![CDATA[Raspberry Pi/树莓派]]></category>
		<category><![CDATA[原创]]></category>
		<category><![CDATA[Embedded Learning Library]]></category>
		<category><![CDATA[Microsoft ELL]]></category>
		<category><![CDATA[Raspberry Pi]]></category>
		<category><![CDATA[机器视觉]]></category>
		<category><![CDATA[树莓派]]></category>
		<guid isPermaLink="false">https://www.codelast.com/?p=9401</guid>

					<description><![CDATA[<p>
微软于2017年6月底发布了一个主要用于嵌入式系统（例如，树莓派，ARM Cortex-M0等）的机器学习库<a href="https://github.com/Microsoft/ELL" rel="noopener noreferrer" target="_blank"><span style="color:#0000ff;"><span style="background-color:#ffa07a;">ELL</span></span></a>（<span style="color:#0000ff;">Embedded Learning Library</span>，<span style="color:#0000ff;">嵌入式学习库</span>）。由于嵌入式设备的计算能力较弱，因此在这些设备上执行一些机器学习的任务&#8212;&#8212;例如实时图像分类&#8212;&#8212;通常速度很慢，所以在这种应用场景下，一般的策略是把请求发送到计算能力强大的云端服务器上去执行，嵌入式设备只作为和用户交互的终端，并不执行关键的计算任务。而微软发布的这个ELL，目标在于把云端的计算任务转移到嵌入式设备上，从而可以使得设备无需联网也能执行这些任务。这个目标看起来很诱人，但它要求ELL的计算速度很快、很节省资源，否则耗时将是不可接受的。<br />
<span id="more-9401"></span><br />
我在树莓派3上试验过用Tensorflow来进行图像分类（看这里：<a href="https://www.codelast.com/?p=8941" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">链接1</span></a>，<a href="https://www.codelast.com/?p=8984" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">链接2</span></a>，<a href="https://www.codelast.com/?p=8995" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">链接3</span></a>），结果是：速度很慢，推断一张图片的类别要花4秒多时间。当然，这个时间肯定可以通过优化缩短，并且它使用的是Inception-v3模型，而不是MobileNet之类专门为移动设备设计的模型。<br />
微软的ELL就是为了解决这个痛点而生。<br />
在ELL发布不久后，我很好奇它在树莓派3代上的表现会如何，是不是真的很&#8220;快&#8221;，能达到非常实用的程度呢？<br />
于是，我开始了ELL试用之路。<br />
首先需要说明的一点是，在开始写本文时，由于ELL刚发布不久，其开发团队仍在频繁更新这个项目，很多文档、代码可能改动很快、前后版本相差很大。因此，如果本文陈述的某些内容与你看到的最新版ELL有所不同，请不要觉得奇怪。<br />
此外，在整个测试过程中我遇到了很多坑：ELL文档里有不少没写清楚、甚至错误的地方，这些问题往往要到后面某一步操作时才暴露出来，我又不得不回头排查之前的哪一步需要修复。所以，如果你也想试一下ELL，先看到本文的话，可能会让你少走些弯路。</p>
<p><span style="background-color:#00ff00;">『1』</span>拿ELL来做什么？<br />
首先，我们要拿ELL来实现一个什么功能？<br />
微软官方给出的一个指导文档是《Getting Started with Computer Vision》，也就是说我们要在嵌入式系统上用ELL实现一个机器视觉的应用，以微软官方提供的下面这个图片为例：<br />
<img decoding="async" alt="" src="https://www.codelast.com/wp-content/uploads/ckfinder/images/coffeemug.jpg" style="width: 643px; height: 513px;" /><br />
即：用树莓派的摄像头拍摄一个物体，运行在树莓派上的ELL程序可以识别出它&#8220;是什么&#8221;。<br />
<span style="background-color:#00ff00;">『2』</span>软硬件准备</p>
<ul>
<li>
		树莓派3代（OS：Arch Linux ARM）</li>
<li>
		用于树莓派的USB摄像头（也可以用和树莓派配套的那个专用摄像头，但价格比较贵，我手上没有）</li>
<li>
		台式机（OS：Ubuntu 14.04 LTS）</li>
</ul>
<p><span style="background-color:#00ff00;">『3』</span>台式机上的准备工作<br />
不要以为我们让模型跑在树莓派上，就没有台式机什么事了。实际上，我们需要先在台式机上做非常多的工作，然后才能在树莓派上跑ELL。<br />
所以，本文讲的就是我们要在台式机上做的那些繁琐工作，<span style="color:#0000ff;">切记：本文的所有操作，都是在台式机上运行的，跟树莓派没有关系</span>。</p>
<ul>
<li>
		<span style="background-color:#dda0dd;">下载ELL的代码</span></li>
</ul>
<p>首先我们从Github上检出ELL的代码，尽管在网页上看ELL的文档更方便一些，但是后面我们是需要用到这份代码的，所以这一步肯定要做：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &#34;Lucida Console&#34;, &#34;DejaVu Sans Mono&#34;, Monaco, &#34;Courier New&#34;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
git clone https://github.com/Microsoft/ELL.git</pre>&#8230; <a href="https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%931/" class="read-more">Read More </a>]]></description>
										<content:encoded><![CDATA[<p>
微软于2017年6月底发布了一个主要用于嵌入式系统（例如，树莓派，ARM Cortex-M0等）的机器学习库<a href="https://github.com/Microsoft/ELL" rel="noopener noreferrer" target="_blank"><span style="color:#0000ff;"><span style="background-color:#ffa07a;">ELL</span></span></a>（<span style="color:#0000ff;">Embedded Learning Library</span>，<span style="color:#0000ff;">嵌入式学习库</span>）。由于嵌入式设备的计算能力较弱，因此在这些设备上执行一些机器学习的任务&mdash;&mdash;例如实时图像分类&mdash;&mdash;通常速度很慢，所以在这种应用场景下，一般的策略是把请求发送到计算能力强大的云端服务器上去执行，嵌入式设备只作为和用户交互的终端，并不执行关键的计算任务。而微软发布的这个ELL，目标在于把云端的计算任务转移到嵌入式设备上，从而可以使得设备无需联网也能执行这些任务。这个目标看起来很诱人，但它要求ELL的计算速度很快、很节省资源，否则耗时将是不可接受的。<br />
<span id="more-9401"></span><br />
我在树莓派3上试验过用Tensorflow来进行图像分类（看这里：<a href="https://www.codelast.com/?p=8941" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">链接1</span></a>，<a href="https://www.codelast.com/?p=8984" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">链接2</span></a>，<a href="https://www.codelast.com/?p=8995" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">链接3</span></a>），结果是：速度很慢，推断一张图片的类别要花4秒多时间。当然，这个时间肯定可以通过优化缩短，并且它使用的是Inception-v3模型，而不是MobileNet之类专门为移动设备设计的模型。<br />
微软的ELL就是为了解决这个痛点而生。<br />
在ELL发布不久后，我很好奇它在树莓派3代上的表现会如何，是不是真的很&ldquo;快&rdquo;，能达到非常实用的程度呢？<br />
于是，我开始了ELL试用之路。<br />
首先需要说明的一点是，在开始写本文时，由于ELL刚发布不久，其开发团队仍在频繁更新这个项目，很多文档、代码可能改动很快、前后版本相差很大。因此，如果本文陈述的某些内容与你看到的最新版ELL有所不同，请不要觉得奇怪。<br />
此外，在整个测试过程中我遇到了很多坑：ELL文档里有不少没写清楚、甚至错误的地方，这些问题往往要到后面某一步操作时才暴露出来，我又不得不回头排查之前的哪一步需要修复。所以，如果你也想试一下ELL，先看到本文的话，可能会让你少走些弯路。</p>
<p><span style="background-color:#00ff00;">『1』</span>拿ELL来做什么？<br />
首先，我们要拿ELL来实现一个什么功能？<br />
微软官方给出的一个指导文档是《Getting Started with Computer Vision》，也就是说我们要在嵌入式系统上用ELL实现一个机器视觉的应用，以微软官方提供的下面这个图片为例：<br />
<img decoding="async" alt="" src="https://www.codelast.com/wp-content/uploads/ckfinder/images/coffeemug.jpg" style="width: 643px; height: 513px;" /><br />
即：用树莓派的摄像头拍摄一个物体，运行在树莓派上的ELL程序可以识别出它&ldquo;是什么&rdquo;。<br />
<span style="background-color:#00ff00;">『2』</span>软硬件准备</p>
<ul>
<li>
		树莓派3代（OS：Arch Linux ARM）</li>
<li>
		用于树莓派的USB摄像头（也可以用和树莓派配套的那个专用摄像头，但价格比较贵，我手上没有）</li>
<li>
		台式机（OS：Ubuntu 14.04 LTS）</li>
</ul>
<p><span style="background-color:#00ff00;">『3』</span>台式机上的准备工作<br />
不要以为我们让模型跑在树莓派上，就没有台式机什么事了。实际上，我们需要先在台式机上做非常多的工作，然后才能在树莓派上跑ELL。<br />
所以，本文讲的就是我们要在台式机上做的那些繁琐工作，<span style="color:#0000ff;">切记：本文的所有操作，都是在台式机上运行的，跟树莓派没有关系</span>。</p>
<ul>
<li>
		<span style="background-color:#dda0dd;">下载ELL的代码</span></li>
</ul>
<p>首先我们从Github上检出ELL的代码，尽管在网页上看ELL的文档更方便一些，但是后面我们是需要用到这份代码的，所以这一步肯定要做：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
git clone https://github.com/Microsoft/ELL.git</pre>
<p></p>
<ul>
<li>
		<span style="background-color:#dda0dd;">安装基础软件依赖</span></li>
</ul>
<p>
	除此之外，编译ELL还对系统有很多软件依赖，例如cmake，llvm之类，具体请看<a href="http://github.com/Microsoft/ELL/blob/master/INSTALL-Ubuntu.md" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">这个</span></a>ELL的文档。整个安装、解决问题的过程真的相当麻烦，要有充分的心理准备。<br />
	<span style="color:#b22222;">第1点注意</span>：cmake要装3.3及以上的版本，这一点ELL的文档里没有写。如果你用 apt-get install cmake 安装，装上的版本可能不符合要求，最后一定会出问题，导致整个流程走不下去。<br />
	所以我是这样装的cmake：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
apt-cache show cmake3
sudo apt-get install cmake3</pre>
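<p>之所以强调cmake版本，是因为 cmake_minimum_required 会按分段数值比较版本号。下面用一个纯Python小例子示意这种比较逻辑（parse_version、meets_minimum 都是本文为演示而假设的函数名，并非CMake或ELL的API）：</p>

```python
# 示意：按分段数值比较版本号，模仿 cmake_minimum_required 的判断逻辑
def parse_version(s):
    """把 "3.3" 或 "2.8.12.2" 这样的版本字符串解析成整数元组。"""
    return tuple(int(part) for part in s.split("."))

def meets_minimum(installed, required):
    """Python元组按字典序比较，天然满足 (2,8,12,2) < (3,3) 这类情况。"""
    return parse_version(installed) >= parse_version(required)

print(meets_minimum("2.8.12.2", "3.3"))  # False：Ubuntu 14.04 默认源里的cmake版本不达标
print(meets_minimum("3.3.2", "3.3"))     # True
```

<p>这也解释了为什么cmake 2.8.12.2会在配置阶段直接报错终止。</p>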
<p>
	<span style="color: rgb(178, 34, 34);">第2点注意</span>：ELL对gcc版本有要求，但ELL的doc里没写。具体请看<a href="https://www.codelast.com/?p=9496" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">这篇文章</span></a>。</p>
<ul>
<li>
		<span style="background-color:#dda0dd;">安装Python 3.6&mdash;&mdash;通过miniconda</span></li>
</ul>
<p>为什么要用Python 3.6？因为ELL的demo程序就是用Python 3.6写的。<br />
根据ELL文档的建议，我们不应该&ldquo;直接&rdquo;在系统里安装Python 3.6，而是通过conda环境来安装，具体请看<a href="https://www.codelast.com/?p=9493" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">这篇文章</span></a>。</p>
<ul>
<li>
		<span style="background-color:#dda0dd;">在miniconda环境下安装必需的软件包</span></li>
</ul>
<p>安装好 miniconda 之后，需要安装 curl，numpy 和 <span style="color:#0000ff;">opencv</span>（如果没有安装过的话）：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
conda install curl
conda install numpy</pre>
<p>你一定感觉很奇怪：上面的命令并没有安装OpenCV啊？！没错，这是因为，按照ELL文档的做法（<span style="color:#008000;">conda install -c conda-forge opencv</span>）安装上的OpenCV是有问题的&mdash;&mdash;至少在我的Ubuntu上不能用。如果你着急的话，可以直接看<a href="https://www.codelast.com/?p=9473" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">这篇文章</span></a>的解决方案；如果你不急的话，可以暂时按ELL文档的说明去安装OpenCV，然后跟着本系列文章的节奏，一点点地发现问题、解决问题（没错，后面的章节还有讲到这个问题）。</p>
<ul>
<li>
		<span style="background-color:#dda0dd;">安装SWIG</span></li>
</ul>
<p>下一步，你需要安装用于生成language binding（语言绑定）的SWIG软件（和conda环境无关）。比如我们要用Python来调用ELL，那么就需要Python binding，诸如此类。<br />
在Ubuntu 14.04上，用 apt-get install swig3.0 安装上的SWIG是3.0.2版本，满足不了ELL要求的&ldquo;version 3.0.12 or later&rdquo;，所以，我们只能自己下载、编译SWIG了：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
wget https://nchc.dl.sourceforge.net/project/swig/swig/swig-3.0.12/swig-3.0.12.tar.gz
tar zxf swig-3.0.12.tar.gz
<span style="color: rgb(255, 176, 84);">cd</span> swig-3.0.12
./configure --without-pcre <span style="color: rgb(255, 157, 0);">&amp;&amp;</span> make <span style="color: rgb(255, 157, 0);">&amp;&amp;</span> sudo make install</pre>
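<p>编译安装完成后，可以用 swig -version 确认版本是否满足要求。下面这个小脚本示意如何从该命令的输出文本里解析并比较版本号（check_swig_output 是本文为演示假设的函数名，输出格式按常见的 "SWIG Version x.y.z" 处理）：</p>

```python
# 示意：从 swig -version 的输出文本中解析版本号，并检查是否满足 3.0.12 及以上
import re

def check_swig_output(output, minimum=(3, 0, 12)):
    m = re.search(r"SWIG Version (\d+)\.(\d+)\.(\d+)", output)
    if not m:
        return False  # 没找到版本号，视为不满足
    return tuple(int(g) for g in m.groups()) >= minimum

print(check_swig_output("SWIG Version 3.0.2"))   # False：apt里的swig3.0版本太低
print(check_swig_output("SWIG Version 3.0.12"))  # True：自行编译安装的版本
```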
<p></p>
<ul>
<li>
		<span style="background-color:#dda0dd;">编译ELL</span></li>
</ul>
<p>下面可以开始编译ELL了。这里的编译过程<span style="color:#ff0000;">不要</span>在conda环境下执行。<br />
回到之前检出的ELL代码的根目录下，执行下面的命令：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
mkdir build
<span style="color: rgb(255, 176, 84);">cd</span> build
cmake ..
make</pre>
<p>如果不出错，那表明你运气是真的好。<br />
我记得我是编译到61％的时候挂掉的，错误信息如下：</p>
<blockquote>
<div>
		[ 0%] Built target documentation</div>
<div>
		......</div>
<div>
		[ 61%] Linking CXX executable common_test</div>
<div>
		../math/libmath.a(BlasWrapper.cpp.o): In function `ell::math::Blas::Copy(int, float const*, int, float*, int)&#39;: BlasWrapper.cpp:(.text+0x31): undefined reference to `cblas_scopy&#39;</div>
<div>
		......</div>
<div>
		../math/libmath.a(BlasWrapper.cpp.o): In function `ell::math::Blas::Gemm(CBLAS_ORDER, CBLAS_TRANSPOSE, CBLAS_TRANSPOSE, int, int, int, double, double const*, int, double const*, int, double, double*, int)&#39;: BlasWrapper.cpp:(.text+0x471): undefined reference to `cblas_dgemm&#39;</div>
<div>
		collect2: error: ld returned 1 exit status</div>
<div>
		make[2]: *** [libraries/common/common_test] Error 1</div>
<div>
		make[1]: *** [libraries/common/CMakeFiles/common_test.dir/all] Error 2</div>
<div>
		make: *** [all] Error 2</div>
</blockquote>
<div>
	这个问题的解决方法请看<a href="https://www.codelast.com/?p=9505" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">这篇文章</span></a>。<br />
	&nbsp;</div>
<ul>
<li>
		<span style="background-color:#dda0dd;">生成Python binding</span></li>
</ul>
<p>接下来，需要配置miniconda里的Python环境：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Create the environment</span>
conda create -n py36 anaconda python=3
<span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Activate the environment</span>
<span style="color: rgb(255, 176, 84);">source</span> activate py36
<span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> ELL requires gcc 5 and above for C++14. Upgrade anaconda&#39;s environment to support it.</span>
conda install libgcc</pre>
<p>注意：第二句命令是 <span style="color:#0000ff;">source activate py36</span>，而ELL的文档里是 activate py36，一执行就报错，会提示你要在前面加source。<br />
这一步耗时较长，等它完成之后，我们就可以生成Python binding了。在同一个命令行窗口里（刚激活了py36环境），到之前检出的ELL代码的根目录下，执行下面的命令（其实前3行命令前面已经执行过了，如果没有删除build目录的话可以不再重新执行）：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
mkdir build
<span style="color: rgb(255, 176, 84);">cd</span> build
cmake ..
make _ELL_python </pre>
<p>如果不出错的话，Python binding就生成成功了，但是你如果在 cmake 那一步遇到这样的错误：</p>
<blockquote>
<div>
		CMake Error at CMakeLists.txt:5 (cmake_minimum_required):</div>
<div>
		&nbsp; CMake 3.3 or higher is required. &nbsp;You are running version 2.8.12.2</div>
<div>
		-- Configuring incomplete, errors occurred!</div>
</blockquote>
<p>这就是本文前面的章节说过的cmake版本过低导致的，前面已经说过了如何安装高版本的cmake，这里不再复述。<br />
然后再重新执行cmake及其后步骤，OK的话，就可以测试一下Python binding是否能正常工作。回到ELL代码的根目录，然后：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(225, 239, 255);">(</span>py36<span style="color: rgb(225, 239, 255);">)</span> [codelast@ ELL]$ <span style="color: rgb(255, 176, 84);">cd</span> build/interfaces/python/test
<span style="color: rgb(225, 239, 255);">(</span>py36<span style="color: rgb(225, 239, 255);">)</span> [codelast@ <span style="color: rgb(255, 176, 84);">test</span>]$ python <span style="color: rgb(255, 176, 84);">test</span>.py</pre>
<p>注意：一定要cd到 build/interfaces/python/test/ 这个目录下跑test！之所以强调这一点，是因为在ELL代码根目录下，还有一个一模一样的&nbsp;interfaces/python/test/ 目录，如果你弄错了，到后者目录下跑test，会发现console没有任何输出信息，很迷惑人。<br />
正常的test输出信息类似于下面这样：</p>
<blockquote>
<div>
		OpenBLAS : Your OS does not support AVX instructions. OpenBLAS is using Nehalem kernels as a fallback, which may give poorer performance.</div>
<div>
		Testing HingeLoss.Evaluate(2, 1) ... Passed</div>
<div>
		Testing HingeLoss.Evaluate(-2, -1) ... Passed</div>
<div>
		<span style="color:#b22222;">（中间还有很多，此处省略）</span></div>
<div>
		Testing SquaredLoss.GetDerivative(4, 2) ... Passed</div>
<div>
		Testing SquaredLoss.GetDerivative(2, 4) ... Passed</div>
<div>
		functions_test passed</div>
<div>
		[1]<span style="white-space:pre"> </span>One subgraph</div>
<div>
		&nbsp; &nbsp; Subgraph Vertices Edges Cycles</div>
<div>
		&nbsp; &nbsp; 0 &nbsp; &nbsp; &nbsp; &nbsp;6 &nbsp; &nbsp; &nbsp; &nbsp;8 &nbsp; &nbsp; 3 &nbsp; &nbsp;</div>
<div>
		[2]<span style="white-space:pre"> </span>One subgraph</div>
<div>
		&nbsp; &nbsp; Subgraph Vertices Edges Cycles</div>
<div>
		&nbsp; &nbsp; 0 &nbsp; &nbsp; &nbsp; &nbsp;6 &nbsp; &nbsp; &nbsp; &nbsp;6 &nbsp; &nbsp; 1 &nbsp; &nbsp;</div>
<div>
		<span style="color:#b22222;">（中间还有很多，此处省略）</span><br />
		[tree_3]<span style="white-space: pre;"> </span>One subgraph</div>
<div>
		&nbsp; &nbsp; Subgraph Vertices Edges Cycles</div>
<div>
		&nbsp; &nbsp; 0 &nbsp; &nbsp; &nbsp; &nbsp;101 &nbsp; &nbsp; &nbsp;126 &nbsp; 26 &nbsp;&nbsp;</div>
<div>
		model_test passed</div>
<div>
		Model 1 size: 6</div>
<div>
		Model 2 size: 6</div>
<div>
		Model 3 size: 8</div>
<div>
		Tree 0 size: 17</div>
<div>
		Tree 1 size: 45</div>
<div>
		Tree 2 size: 73</div>
<div>
		Tree 3 size: 101</div>
<div>
		Loading file ../../../examples/data/model_1.model</div>
<div>
		Loading file ../../../examples/data/model_2.model</div>
<div>
		common_test passed</div>
<div>
		trainers_test.test -- TBD</div>
<div>
		trainers_test passed</div>
<div>
		predictors_test.test -- TBD</div>
<div>
		predictors_test passed</div>
<div>
		nodes_test.test -- TBD</div>
<div>
		nodes_test passed</div>
<div>
		linear_test.test -- TBD</div>
<div>
		linear_test passed</div>
<div>
		evaluators_test.test -- TBD</div>
<div>
		evaluators_test passed</div>
<div>
		Testing ModelBuilder ... Passed</div>
<div>
		modelbuilder_test passed</div>
</blockquote>
<p>这说明Python binding可以正常工作了。<br />
下面，我们将进入和模型相关的测试，请看<a href="https://www.codelast.com/?p=9635" rel="noopener noreferrer" target="_blank"><span style="background-color:#ffa07a;">后一篇文章</span></a>。<br />
</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e5%9c%a8%e6%a0%91%e8%8e%93%e6%b4%be3%e4%b8%8a%e4%bd%bf%e7%94%a8%e5%be%ae%e8%bd%afell%e5%b5%8c%e5%85%a5%e5%bc%8f%e5%ad%a6%e4%b9%a0%e5%ba%931/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
			</item>
		<item>
		<title>[原创] 执行ELL的demo程序cntkDemo.py时程序僵死的问题</title>
		<link>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e6%89%a7%e8%a1%8cell%e7%9a%84demo%e7%a8%8b%e5%ba%8fcntkdemo-py%e6%97%b6%e7%a8%8b%e5%ba%8f%e5%83%b5%e6%ad%bb%e7%9a%84%e9%97%ae%e9%a2%98/</link>
					<comments>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e6%89%a7%e8%a1%8cell%e7%9a%84demo%e7%a8%8b%e5%ba%8fcntkdemo-py%e6%97%b6%e7%a8%8b%e5%ba%8f%e5%83%b5%e6%ad%bb%e7%9a%84%e9%97%ae%e9%a2%98/#respond</comments>
		
		<dc:creator><![CDATA[learnhard]]></dc:creator>
		<pubDate>Thu, 20 Jul 2017 17:38:28 +0000</pubDate>
				<category><![CDATA[Linux]]></category>
		<category><![CDATA[原创]]></category>
		<category><![CDATA[cntkDemo.py]]></category>
		<category><![CDATA[ELL]]></category>
		<category><![CDATA[Embedded Learning Library]]></category>
		<category><![CDATA[僵死]]></category>
		<guid isPermaLink="false">https://www.codelast.com/?p=9594</guid>

					<description><![CDATA[<p>
OS：Ubuntu 14.04</p>
<p>在台式机上执行ELL的demo程序&#160;cntkDemo.py 时，可能会遇到程序僵死的问题。<br />
cntkDemo.py 这个程序会调用OpenCV，在一个GUI窗口中显示USB摄像头拍摄的实时视频流，而僵死的现象正是：执行到弹出GUI窗口显示摄像头拍摄的视频流的代码的时候，程序进入僵死状态，不能执行后续逻辑。此时，只能Ctrl+C终止掉程序。<br />
<span id="more-9594"></span><br />
我的Ubuntu 14.04是一台老爷机，性能非常差，我觉得这有可能是程序僵死的原因之一。我试了几次都是这样，于是我打算换一个思路来跑这个demo，不再纠结于解决窗口僵死的问题。<br />
先来看一下原版的 cntkDemo.py 部分代码：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &#34;Lucida Console&#34;, &#34;DejaVu Sans Mono&#34;, Monaco, &#34;Courier New&#34;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
    <span style="color: rgb(255, 157, 0);">while</span> (<span style="color: rgb(255, 98, 140);">True</span>):
        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Grab next frame</span>
        ret, frame <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">cap.read<span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Prepare the image to send to the model.</span>
        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> This involves scaling to the required input dimension and re-ordering from BGR to RGB</span>
        data <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">helper.prepare_image_for_predictor</span></pre>&#8230; <a href="https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e6%89%a7%e8%a1%8cell%e7%9a%84demo%e7%a8%8b%e5%ba%8fcntkdemo-py%e6%97%b6%e7%a8%8b%e5%ba%8f%e5%83%b5%e6%ad%bb%e7%9a%84%e9%97%ae%e9%a2%98/" class="read-more">Read More </a>]]></description>
										<content:encoded><![CDATA[<p>
OS：Ubuntu 14.04</p>
<p>在台式机上执行ELL的demo程序&nbsp;cntkDemo.py 时，可能会遇到程序僵死的问题。<br />
cntkDemo.py 这个程序会调用OpenCV，在一个GUI窗口中显示USB摄像头拍摄的实时视频流，而僵死的现象正是：执行到弹出GUI窗口显示摄像头拍摄的视频流的代码的时候，程序进入僵死状态，不能执行后续逻辑。此时，只能Ctrl+C终止掉程序。<br />
<span id="more-9594"></span><br />
我的Ubuntu 14.04是一台老爷机，性能非常差，我觉得这有可能是程序僵死的原因之一。我试了几次都是这样，于是我打算换一个思路来跑这个demo，不再纠结于解决窗口僵死的问题。<br />
先来看一下原版的 cntkDemo.py 部分代码：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
    <span style="color: rgb(255, 157, 0);">while</span> (<span style="color: rgb(255, 98, 140);">True</span>):
        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Grab next frame</span>
        ret, frame <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">cap.read<span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Prepare the image to send to the model.</span>
        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> This involves scaling to the required input dimension and re-ordering from BGR to RGB</span>
        data <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">helper.prepare_image_for_predictor<span style="color: rgb(225, 239, 255);">(</span>frame<span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Get the model to classify the image, by returning a list of probabilities for the classes it can detect</span>
        predictions <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">model.Predict<span style="color: rgb(225, 239, 255);">(</span>data<span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Get the (at most) top 5 predictions that meet our threshold. This is returned as a list of tuples,</span>
        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> each with the text label and the prediction score.</span>
        top5 <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">helper.get_top_n<span style="color: rgb(225, 239, 255);">(</span>predictions, <span style="color: rgb(255, 98, 140);">5</span><span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Turn the top5 into a text string to display</span>
        text <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(58, 217, 0);">&quot;</span><span style="color: rgb(58, 217, 0);">&quot;</span>.<span style="color: rgb(255, 238, 128);">join<span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(225, 239, 255);">[</span><span style="color: rgb(128, 255, 187);">str</span><span style="color: rgb(225, 239, 255);">(</span>element<span style="color: rgb(225, 239, 255);">[</span><span style="color: rgb(255, 98, 140);">0</span><span style="color: rgb(225, 239, 255);">]</span><span style="color: rgb(225, 239, 255);">)</span> <span style="color: rgb(255, 157, 0);">+</span> <span style="color: rgb(58, 217, 0);">&quot;</span>(<span style="color: rgb(58, 217, 0);">&quot;</span> <span style="color: rgb(255, 157, 0);">+</span> <span style="color: rgb(128, 255, 187);">str</span><span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(128, 255, 187);">int</span><span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(255, 98, 140);">100</span><span style="color: rgb(255, 157, 0);">*</span>element<span style="color: rgb(225, 239, 255);">[</span><span style="color: rgb(255, 98, 140);">1</span><span style="color: rgb(225, 239, 255);">]</span><span style="color: rgb(225, 239, 255);">)</span><span style="color: rgb(225, 239, 255);">)</span> <span style="color: rgb(255, 157, 0);">+</span> <span style="color: rgb(58, 217, 0);">&quot;</span>%)  <span style="color: rgb(58, 217, 0);">&quot;</span> <span style="color: rgb(255, 157, 0);">for</span> element <span style="color: rgb(255, 157, 0);">in</span> top5<span style="color: rgb(225, 239, 255);">]</span><span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Draw the text on the frame</span>
        frameToShow <span style="color: rgb(255, 157, 0);">=</span> frame
        <span style="color: rgb(255, 238, 128);">helper.draw_label<span style="color: rgb(225, 239, 255);">(</span>frameToShow, text<span style="color: rgb(225, 239, 255);">)</span></span>
        <span style="color: rgb(255, 238, 128);">helper.draw_fps<span style="color: rgb(225, 239, 255);">(</span>frameToShow<span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Show the new frame</span>
        <span style="color: rgb(255, 238, 128);">cv2.imshow<span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(58, 217, 0);">&#39;</span>frame<span style="color: rgb(58, 217, 0);">&#39;</span>, frameToShow<span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Wait for Esc key</span>
        <span style="color: rgb(255, 157, 0);">if</span> <span style="color: rgb(255, 238, 128);">cv2.waitKey<span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(255, 98, 140);">1</span><span style="color: rgb(225, 239, 255);">)</span></span> <span style="color: rgb(255, 157, 0);">&amp;</span> <span style="color: rgb(255, 98, 140);">0xFF</span> <span style="color: rgb(255, 157, 0);">==</span> <span style="color: rgb(255, 98, 140);">27</span>:
            <span style="color: rgb(255, 157, 0);">break</span></pre>
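<p>上面循环里把top5预测结果拼成显示文本的那一行，可以抽成一个独立的纯Python函数来理解（这里假设 helper.get_top_n 返回 (标签, 概率) 元组的列表，与demo中的用法一致；format_top5 是本文为演示假设的函数名）：</p>

```python
# 示意：把 [(标签, 概率), ...] 形式的top-5结果拼成叠加显示用的文本
def format_top5(top5):
    return "".join(str(label) + "(" + str(int(100 * score)) + "%)  "
                   for label, score in top5)

print(format_top5([("coffee mug", 0.85), ("cup", 0.07)]))
```

<p>拼出来的就是类似 coffee mug(85%)  cup(7%) 这样叠加在视频帧上的那行文字。</p>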
<p>
这段代码的注释非常清晰，它的功能是：在一个无限循环中，不断地去抓取USB摄像头拍摄的一帧图像，然后用model预测其分类及概率，最后再把预测结果叠加显示在GUI窗口中，类似于下面这样：<br />
<img decoding="async" alt="ELL coffee mug" src="https://www.codelast.com/wp-content/uploads/2020/04/ell_coffee_mug.jpg" style="width: 500px; height: 399px;" /><br />
既然 cntkDemo.py 主要是为了测试model能不能正常跑，那么我在命令行以文字形式显示预测结果也是一样的啊，没有必要非得在GUI窗口中展示。<br />
于是我把程序改成了下面这样（完整程序）：</p>
<pre style="margin-top: 0px; margin-bottom: 0px; font-stretch: normal; font-size: 0.9333em; line-height: 1.5em; font-family: Consolas, &quot;Lucida Console&quot;, &quot;DejaVu Sans Mono&quot;, Monaco, &quot;Courier New&quot;, monospace; background: rgb(0, 34, 64); color: rgb(255, 255, 255);">
<span style="color: rgb(255, 157, 0);">import</span> sys
<span style="color: rgb(255, 157, 0);">import</span> os
<span style="color: rgb(255, 157, 0);">import</span> numpy <span style="color: rgb(255, 157, 0);">as</span> np
<span style="color: rgb(255, 157, 0);">import</span> cv2
<span style="color: rgb(255, 157, 0);">import</span> time

<span style="color: rgb(255, 157, 0);">import</span> findEll
<span style="color: rgb(255, 157, 0);">import</span> cntk_to_ell
<span style="color: rgb(255, 157, 0);">import</span> modelHelper <span style="color: rgb(255, 157, 0);">as</span> mh

<span style="color: rgb(255, 238, 128);">def</span> <span style="color: rgb(255, 221, 0);">get_ell_predictor</span><span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(204, 204, 204);">modelConfig</span><span style="color: rgb(225, 239, 255);">)</span><span style="color: rgb(225, 239, 255);">:</span>
    <span style="color: rgb(58, 217, 0);">&quot;&quot;&quot;</span>Imports a model and returns an ELL.Predictor.<span style="color: rgb(58, 217, 0);">&quot;&quot;&quot;</span>
    <span style="color: rgb(255, 157, 0);">return</span> <span style="color: rgb(255, 238, 128);">cntk_to_ell.predictor_from_cntk_model<span style="color: rgb(225, 239, 255);">(</span>modelConfig.model_files<span style="color: rgb(225, 239, 255);">[</span><span style="color: rgb(255, 98, 140);">0</span><span style="color: rgb(225, 239, 255);">]</span><span style="color: rgb(225, 239, 255);">)</span></span>

<span style="color: rgb(255, 238, 128);">def</span> <span style="color: rgb(255, 221, 0);">main</span><span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(225, 239, 255);">)</span><span style="color: rgb(225, 239, 255);">:</span>

    <span style="color: rgb(255, 157, 0);">if</span> (<span style="color: rgb(255, 157, 0);">not</span> <span style="color: rgb(255, 238, 128);">os.path.exists<span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(58, 217, 0);">&#39;</span>VGG16_ImageNet_Caffe.model<span style="color: rgb(58, 217, 0);">&#39;</span><span style="color: rgb(225, 239, 255);">)</span></span>):
        <span style="color: rgb(255, 157, 0);">print</span>(<span style="color: rgb(58, 217, 0);">&quot;</span>Please download the &#39;VGG16_ImageNet_Caffe.model&#39; file, see README.md<span style="color: rgb(58, 217, 0);">&quot;</span>)
        <span style="color: rgb(255, 238, 128);">sys.exit<span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(255, 98, 140);">1</span><span style="color: rgb(225, 239, 255);">)</span></span>
        
    <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> ModelConfig for VGG16 model from CNTK Model Gallery</span>
    <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Follow the instructions in README.md to download the model if you intend to use it.</span>
    helper <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">mh.ModelHelper<span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(58, 217, 0);">&quot;</span>VGG16ImageNet<span style="color: rgb(58, 217, 0);">&quot;</span>, <span style="color: rgb(225, 239, 255);">[</span><span style="color: rgb(58, 217, 0);">&quot;</span>VGG16_ImageNet_Caffe.model<span style="color: rgb(58, 217, 0);">&quot;</span><span style="color: rgb(225, 239, 255);">]</span>, <span style="color: rgb(58, 217, 0);">&quot;</span>cntkVgg16ImageNetLabels.txt<span style="color: rgb(58, 217, 0);">&quot;</span>, <span style="color: rgb(204, 204, 204);">scaleFactor</span><span style="color: rgb(255, 157, 0);">=</span><span style="color: rgb(255, 98, 140);">1.0</span><span style="color: rgb(225, 239, 255);">)</span></span>

    <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Import the model</span>
    model <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">get_ell_predictor<span style="color: rgb(225, 239, 255);">(</span>helper<span style="color: rgb(225, 239, 255);">)</span></span>

    <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Save the model</span>
    <span style="color: rgb(255, 238, 128);">helper.save_ell_predictor_to_file<span style="color: rgb(225, 239, 255);">(</span>model, <span style="color: rgb(58, 217, 0);">&quot;</span>vgg16ImageNet.map<span style="color: rgb(58, 217, 0);">&quot;</span><span style="color: rgb(225, 239, 255);">)</span></span>

    camera <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 98, 140);">0</span>
    <span style="color: rgb(255, 157, 0);">if</span> (<span style="color: rgb(255, 238, 128);"><span style="color: rgb(255, 176, 84);">len</span><span style="color: rgb(225, 239, 255);">(</span>sys.argv<span style="color: rgb(225, 239, 255);">)</span></span> <span style="color: rgb(255, 157, 0);">&gt;</span> <span style="color: rgb(255, 98, 140);">1</span>):
        camera <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);"><span style="color: rgb(128, 255, 187);">int</span><span style="color: rgb(225, 239, 255);">(</span>sys.argv<span style="color: rgb(225, 239, 255);">[</span><span style="color: rgb(255, 98, 140);">1</span><span style="color: rgb(225, 239, 255);">]</span><span style="color: rgb(225, 239, 255);">)</span></span> 

    <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Start video capture device</span>
    cap <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">cv2.VideoCapture<span style="color: rgb(225, 239, 255);">(</span>camera<span style="color: rgb(225, 239, 255);">)</span></span>

    <span style="color: rgb(255, 157, 0);">while</span> (<span style="color: rgb(255, 98, 140);">True</span>):
        <span style="color: rgb(255, 157, 0);">print</span>(<span style="color: rgb(58, 217, 0);">&#39;</span>Read a frame from camera...<span style="color: rgb(58, 217, 0);">&#39;</span>)
        ret, frame <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">cap.read<span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Prepare the image to send to the model.</span>
        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> This involves scaling to the required input dimension and re-ordering from BGR to RGB</span>
        data <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">helper.prepare_image_for_predictor<span style="color: rgb(225, 239, 255);">(</span>frame<span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Get the model to classify the image, by returning a list of probabilities for the classes it can detect</span>
        predictions <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">model.Predict<span style="color: rgb(225, 239, 255);">(</span>data<span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Get the (at most) top 5 predictions that meet our threshold. This is returned as a list of tuples,</span>
        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> each with the text label and the prediction score.</span>
        top5 <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(255, 238, 128);">helper.get_top_n<span style="color: rgb(225, 239, 255);">(</span>predictions, <span style="color: rgb(255, 98, 140);">5</span><span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Turn the top5 into a text string to display</span>
        text <span style="color: rgb(255, 157, 0);">=</span> <span style="color: rgb(58, 217, 0);">&quot;</span><span style="color: rgb(58, 217, 0);">&quot;</span>.<span style="color: rgb(255, 238, 128);">join<span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(225, 239, 255);">[</span><span style="color: rgb(128, 255, 187);">str</span><span style="color: rgb(225, 239, 255);">(</span>element<span style="color: rgb(225, 239, 255);">[</span><span style="color: rgb(255, 98, 140);">0</span><span style="color: rgb(225, 239, 255);">]</span><span style="color: rgb(225, 239, 255);">)</span> <span style="color: rgb(255, 157, 0);">+</span> <span style="color: rgb(58, 217, 0);">&quot;</span>(<span style="color: rgb(58, 217, 0);">&quot;</span> <span style="color: rgb(255, 157, 0);">+</span> <span style="color: rgb(128, 255, 187);">str</span><span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(128, 255, 187);">int</span><span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(255, 98, 140);">100</span><span style="color: rgb(255, 157, 0);">*</span>element<span style="color: rgb(225, 239, 255);">[</span><span style="color: rgb(255, 98, 140);">1</span><span style="color: rgb(225, 239, 255);">]</span><span style="color: rgb(225, 239, 255);">)</span><span style="color: rgb(225, 239, 255);">)</span> <span style="color: rgb(255, 157, 0);">+</span> <span style="color: rgb(58, 217, 0);">&quot;</span>%)  <span style="color: rgb(58, 217, 0);">&quot;</span> <span style="color: rgb(255, 157, 0);">for</span> element <span style="color: rgb(255, 157, 0);">in</span> top5<span style="color: rgb(225, 239, 255);">]</span><span style="color: rgb(225, 239, 255);">)</span></span>

        <span style="color: rgb(0, 136, 255); font-style: italic;"><span style="color: rgb(225, 239, 255);">#</span> Output the text on command line</span>
        <span style="color: rgb(255, 157, 0);">print</span>(text)

<span style="color: rgb(255, 157, 0);">if</span> <span style="color: rgb(128, 255, 187);">__name__</span> <span style="color: rgb(255, 157, 0);">==</span> <span style="color: rgb(58, 217, 0);">&quot;</span>__main__<span style="color: rgb(58, 217, 0);">&quot;</span>:
    <span style="color: rgb(255, 238, 128);">main<span style="color: rgb(225, 239, 255);">(</span><span style="color: rgb(225, 239, 255);">)</span></span></pre>
<p>
The key change is that the script no longer pops up a GUI window via cv2.imshow(); instead, it prints the results to the command line with print(text).<br />
In my tests, the program no longer hangs at all on my aging machine. Whatever object you point the camera at, that is what gets captured, and the model classifies that object's image.<br />
<span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
Below is the command-line output from one of my test runs:</p>
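<p>The join expression that builds the text line can be sketched in isolation. Note that format_top_n and the sample tuples below are hypothetical stand-ins for the (label, score) list that helper.get_top_n returns:</p>

```python
# A standalone sketch of the string-building step in the script above.
# format_top_n and the sample data are made up; in the real script the
# labels and scores come from helper.get_top_n(predictions, 5).
def format_top_n(predictions):
    """Turn (label, score) tuples into a 'label(NN%)  ' summary string."""
    return "".join(str(label) + "(" + str(int(100 * score)) + "%)  "
                   for label, score in predictions)

top5 = [("lighter, light, igniter, ignitor", 0.28), ("matchstick", 0.11)]
print(format_top_n(top5))
```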
<blockquote>
<div>
		OpenBLAS : Your OS does not support AVX instructions. OpenBLAS is using Nehalem kernels as a fallback, which may give poorer performance.</div>
<div>
		Read a frame from camera, time 1</div>
<div>
		Frame 1 saved to disk</div>
<div>
		Read a frame from camera, time 2</div>
<div>
		Frame 2 saved to disk</div>
<div>
		Read a frame from camera, time 3</div>
<div>
		Frame 3 saved to disk</div>
<div>
		Read a frame from camera, time 4</div>
<div>
		Frame 4 saved to disk</div>
<div>
		Read a frame from camera, time 5</div>
<div>
		Frame 5 saved to disk</div>
<div>
		Loading...</div>
<div>
		Selected CPU as the process wide default device.</div>
<div>
		&nbsp;</div>
<div>
		Finished loading.</div>
<div>
		Pre-processing...</div>
<div>
		&nbsp;</div>
<div>
		Will not process Dropout - skipping this layer as irrelevant.</div>
<div>
		&nbsp;</div>
<div>
		Will not process Dropout - skipping this layer as irrelevant.</div>
<div>
		&nbsp;</div>
<div>
		Will not process Combine - skipping this layer as irrelevant.</div>
<div>
		Convolution : &nbsp;226x226x3 &nbsp;-&gt; &nbsp;224x224x64 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;224x224x64 &nbsp;-&gt; &nbsp;226x226x64 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;226x226x64 &nbsp;-&gt; &nbsp;224x224x64 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;224x224x64 &nbsp;-&gt; &nbsp;224x224x64 | padding &nbsp;0</div>
<div>
		Pooling : &nbsp;224x224x64 &nbsp;-&gt; &nbsp;114x114x64 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;114x114x64 &nbsp;-&gt; &nbsp;112x112x128 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;112x112x128 &nbsp;-&gt; &nbsp;114x114x128 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;114x114x128 &nbsp;-&gt; &nbsp;112x112x128 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;112x112x128 &nbsp;-&gt; &nbsp;112x112x128 | padding &nbsp;0</div>
<div>
		Pooling : &nbsp;112x112x128 &nbsp;-&gt; &nbsp;58x58x128 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;58x58x128 &nbsp;-&gt; &nbsp;56x56x256 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;56x56x256 &nbsp;-&gt; &nbsp;58x58x256 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;58x58x256 &nbsp;-&gt; &nbsp;56x56x256 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;56x56x256 &nbsp;-&gt; &nbsp;58x58x256 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;58x58x256 &nbsp;-&gt; &nbsp;56x56x256 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;56x56x256 &nbsp;-&gt; &nbsp;56x56x256 | padding &nbsp;0</div>
<div>
		Pooling : &nbsp;56x56x256 &nbsp;-&gt; &nbsp;30x30x256 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;30x30x256 &nbsp;-&gt; &nbsp;28x28x512 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;28x28x512 &nbsp;-&gt; &nbsp;30x30x512 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;30x30x512 &nbsp;-&gt; &nbsp;28x28x512 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;28x28x512 &nbsp;-&gt; &nbsp;30x30x512 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;30x30x512 &nbsp;-&gt; &nbsp;28x28x512 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;28x28x512 &nbsp;-&gt; &nbsp;28x28x512 | padding &nbsp;0</div>
<div>
		Pooling : &nbsp;28x28x512 &nbsp;-&gt; &nbsp;16x16x512 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;16x16x512 &nbsp;-&gt; &nbsp;14x14x512 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;14x14x512 &nbsp;-&gt; &nbsp;16x16x512 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;16x16x512 &nbsp;-&gt; &nbsp;14x14x512 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;14x14x512 &nbsp;-&gt; &nbsp;16x16x512 | padding &nbsp;0</div>
<div>
		Convolution : &nbsp;16x16x512 &nbsp;-&gt; &nbsp;14x14x512 | padding &nbsp;1</div>
<div>
		ReLU : &nbsp;14x14x512 &nbsp;-&gt; &nbsp;14x14x512 | padding &nbsp;0</div>
<div>
		Pooling : &nbsp;14x14x512 &nbsp;-&gt; &nbsp;7x7x512 | padding &nbsp;0</div>
<div>
		linear : &nbsp;7x7x512 &nbsp;-&gt; &nbsp;1x1x4096 | padding &nbsp;0</div>
<div>
		ReLU : &nbsp;1x1x4096 &nbsp;-&gt; &nbsp;1x1x4096 | padding &nbsp;0</div>
<div>
		linear : &nbsp;1x1x4096 &nbsp;-&gt; &nbsp;1x1x4096 | padding &nbsp;0</div>
<div>
		ReLU : &nbsp;1x1x4096 &nbsp;-&gt; &nbsp;1x1x4096 | padding &nbsp;0</div>
<div>
		linear : &nbsp;1x1x4096 &nbsp;-&gt; &nbsp;1x1x1000 | padding &nbsp;0</div>
<div>
		Softmax : &nbsp;1x1x1000 &nbsp;-&gt; &nbsp;1x1x1000 | padding &nbsp;0</div>
<div>
		&nbsp;</div>
<div>
		Finished pre-processing.</div>
<div>
		&nbsp;</div>
<div>
		Constructing equivalent ELL layers from CNTK...</div>
<div>
		Converting layer &nbsp;conv1_1: Convolution(data: Tensor[3,224,224]) -&gt; Tensor[64,224,224]</div>
<div>
		Converting layer &nbsp;relu1_1: ReLU(conv1_1: Tensor[64,224,224]) -&gt; Tensor[64,224,224]</div>
<div>
		Converting layer &nbsp;conv1_2: Convolution(relu1_1: Tensor[64,224,224]) -&gt; Tensor[64,224,224]</div>
<div>
		Converting layer &nbsp;relu1_2: ReLU(conv1_2: Tensor[64,224,224]) -&gt; Tensor[64,224,224]</div>
<div>
		Converting layer &nbsp;pool1: Pooling(relu1_2: Tensor[64,224,224]) -&gt; Tensor[64,112,112]</div>
<div>
		Converting layer &nbsp;conv2_1: Convolution(pool1: Tensor[64,112,112]) -&gt; Tensor[128,112,112]</div>
<div>
		Converting layer &nbsp;relu2_1: ReLU(conv2_1: Tensor[128,112,112]) -&gt; Tensor[128,112,112]</div>
<div>
		Converting layer &nbsp;conv2_2: Convolution(relu2_1: Tensor[128,112,112]) -&gt; Tensor[128,112,112]</div>
<div>
		Converting layer &nbsp;relu2_2: ReLU(conv2_2: Tensor[128,112,112]) -&gt; Tensor[128,112,112]</div>
<div>
		Converting layer &nbsp;pool2: Pooling(relu2_2: Tensor[128,112,112]) -&gt; Tensor[128,56,56]</div>
<div>
		Converting layer &nbsp;conv3_1: Convolution(pool2: Tensor[128,56,56]) -&gt; Tensor[256,56,56]</div>
<div>
		Converting layer &nbsp;relu3_1: ReLU(conv3_1: Tensor[256,56,56]) -&gt; Tensor[256,56,56]</div>
<div>
		Converting layer &nbsp;conv3_2: Convolution(relu3_1: Tensor[256,56,56]) -&gt; Tensor[256,56,56]</div>
<div>
		Converting layer &nbsp;relu3_2: ReLU(conv3_2: Tensor[256,56,56]) -&gt; Tensor[256,56,56]</div>
<div>
		Converting layer &nbsp;conv3_3: Convolution(relu3_2: Tensor[256,56,56]) -&gt; Tensor[256,56,56]</div>
<div>
		Converting layer &nbsp;relu3_3: ReLU(conv3_3: Tensor[256,56,56]) -&gt; Tensor[256,56,56]</div>
<div>
		Converting layer &nbsp;pool3: Pooling(relu3_3: Tensor[256,56,56]) -&gt; Tensor[256,28,28]</div>
<div>
		Converting layer &nbsp;conv4_1: Convolution(pool3: Tensor[256,28,28]) -&gt; Tensor[512,28,28]</div>
<div>
		Converting layer &nbsp;relu4_1: ReLU(conv4_1: Tensor[512,28,28]) -&gt; Tensor[512,28,28]</div>
<div>
		Converting layer &nbsp;conv4_2: Convolution(relu4_1: Tensor[512,28,28]) -&gt; Tensor[512,28,28]</div>
<div>
		Converting layer &nbsp;relu4_2: ReLU(conv4_2: Tensor[512,28,28]) -&gt; Tensor[512,28,28]</div>
<div>
		Converting layer &nbsp;conv4_3: Convolution(relu4_2: Tensor[512,28,28]) -&gt; Tensor[512,28,28]</div>
<div>
		Converting layer &nbsp;relu4_3: ReLU(conv4_3: Tensor[512,28,28]) -&gt; Tensor[512,28,28]</div>
<div>
		Converting layer &nbsp;pool4: Pooling(relu4_3: Tensor[512,28,28]) -&gt; Tensor[512,14,14]</div>
<div>
		Converting layer &nbsp;conv5_1: Convolution(pool4: Tensor[512,14,14]) -&gt; Tensor[512,14,14]</div>
<div>
		Converting layer &nbsp;relu5_1: ReLU(conv5_1: Tensor[512,14,14]) -&gt; Tensor[512,14,14]</div>
<div>
		Converting layer &nbsp;conv5_2: Convolution(relu5_1: Tensor[512,14,14]) -&gt; Tensor[512,14,14]</div>
<div>
		Converting layer &nbsp;relu5_2: ReLU(conv5_2: Tensor[512,14,14]) -&gt; Tensor[512,14,14]</div>
<div>
		Converting layer &nbsp;conv5_3: Convolution(relu5_2: Tensor[512,14,14]) -&gt; Tensor[512,14,14]</div>
<div>
		Converting layer &nbsp;relu5_3: ReLU(conv5_3: Tensor[512,14,14]) -&gt; Tensor[512,14,14]</div>
<div>
		Converting layer &nbsp;pool5: Pooling(relu5_3: Tensor[512,14,14]) -&gt; Tensor[512,7,7]</div>
<div>
		Converting layer &nbsp;fc6: linear(pool5: Tensor[512,7,7]) -&gt; Tensor[4096]</div>
<div>
		Converting layer &nbsp;relu6: ReLU(fc6: Tensor[4096]) -&gt; Tensor[4096]</div>
<div>
		Converting layer &nbsp;fc7: linear(drop6: Tensor[4096]) -&gt; Tensor[4096]</div>
<div>
		Converting layer &nbsp;relu7: ReLU(fc7: Tensor[4096]) -&gt; Tensor[4096]</div>
<div>
		Converting layer &nbsp;fc8: linear(drop7: Tensor[4096]) -&gt; Tensor[1000]</div>
<div>
		Converting layer &nbsp;prob: Softmax(fc8: Tensor[1000]) -&gt; Tensor[1000]</div>
<div>
		&nbsp;</div>
<div>
		...Finished constructing ELL layers.</div>
<div>
		lighter, light, igniter, ignitor(28%) &nbsp;</div>
<div>
		lighter, light, igniter, ignitor(28%) &nbsp;</div>
<div>
		&nbsp;</div>
<div>
		lighter, light, igniter, ignitor(32%) &nbsp;</div>
<div>
		lighter, light, igniter, ignitor(30%)</div>
<div>
		......</div>
</blockquote>
<div>
<p>	Some may object that this is not intuitive: there is no way to be sure what the camera is currently pointed at. If you really want to see the picture, there is a compromise: use <span style="color:#0000ff;">cv2.imwrite(&#39;/home/codelast/current.jpg&#39;, frame)</span> to save a captured frame to disk, then open the file yourself.</p>
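<p>A minimal sketch of that workaround follows; the frame_path helper and the directory are made up, while cv2.imwrite itself is the real OpenCV call:</p>

```python
import os

def frame_path(directory, index):
    """Build the file name for saved frame number `index` (hypothetical helper)."""
    return os.path.join(directory, "frame_%03d.jpg" % index)

# In the capture loop, instead of cv2.imshow(), one could periodically run
# (assuming OpenCV is installed and `frame` is the image from cap.read()):
#     cv2.imwrite(frame_path("/home/codelast", i), frame)
print(frame_path("/home/codelast", 3))
```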
<p>	Finally, I have to lament that my desktop really is too old: this demo runs very slowly on it, and it takes well over five minutes just to reach the prediction step.<br />
	<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;Copyright notice&nbsp;<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;<br />
	Reposts must credit the source: <u><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><em><span style="color: rgb(0, 0, 255);"><strong style="font-size: 16px;"><span style="font-family: arial, helvetica, sans-serif;">codelast.com</span></strong></span></em></a></u>&nbsp;<br />
	Thanks for following my WeChat official account (scan the QR code with WeChat):</p>
<p style="border: 0px; font-size: 13px; margin: 0px 0px 9px; outline: 0px; padding: 0px; color: rgb(77, 77, 77);">
		<img decoding="async" alt="wechat qrcode of codelast" src="https://www.codelast.com/codelast_wechat_qr_code.jpg" style="width: 200px; height: 200px;" /></p>
</div>
]]></content:encoded>
					
					<wfw:commentRss>https://www.codelast.com/%e5%8e%9f%e5%88%9b-%e6%89%a7%e8%a1%8cell%e7%9a%84demo%e7%a8%8b%e5%ba%8fcntkdemo-py%e6%97%b6%e7%a8%8b%e5%ba%8f%e5%83%b5%e6%ad%bb%e7%9a%84%e9%97%ae%e9%a2%98/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
