<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>prediction &#8211; 编码无悔 /  Intent &amp; Focused</title>
	<atom:link href="https://www.codelast.com/tag/prediction/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.codelast.com</link>
	<description>The Road to Optimization</description>
	<lastBuildDate>Mon, 04 May 2020 14:25:51 +0000</lastBuildDate>
	<language>zh-Hans</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>[Original] How to use the GPU for inference/prediction in PyTorch</title>
		<link>https://www.codelast.com/%e5%8e%9f%e5%88%9b-pytorch%e5%81%9ainference-prediction%e7%9a%84%e6%97%b6%e5%80%99%e5%a6%82%e4%bd%95%e4%bd%bf%e7%94%a8gpu/</link>
					<comments>https://www.codelast.com/%e5%8e%9f%e5%88%9b-pytorch%e5%81%9ainference-prediction%e7%9a%84%e6%97%b6%e5%80%99%e5%a6%82%e4%bd%95%e4%bd%bf%e7%94%a8gpu/#respond</comments>
		
		<dc:creator><![CDATA[learnhard]]></dc:creator>
		<pubDate>Mon, 27 Apr 2020 11:50:56 +0000</pubDate>
				<category><![CDATA[Algorithm]]></category>
		<category><![CDATA[Original]]></category>
		<category><![CDATA[GPU]]></category>
		<category><![CDATA[inference]]></category>
		<category><![CDATA[prediction]]></category>
		<category><![CDATA[PyTorch]]></category>
		<guid isPermaLink="false">https://www.codelast.com/?p=12047</guid>

					<description><![CDATA[<p>Without further ado, let's get straight to the point.</p>
<p><span style="color:#0000ff;">✔</span> Check whether a GPU can be used<br />
Several things can prevent PyTorch from using the GPU, e.g. the CPU-only build of PyTorch is installed, or the graphics driver is not set up correctly. When everything is working, the condition in the following if statement returns True:</p>
<pre style="background-color:#2b2b2b;color:#a9b7c6;">
if torch.cuda.is_available():
    print('PyTorch can use GPU on current machine!')</pre>
<p><span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
<span style="color: rgb(0, 0, 255);">✔</span>&#160;Put the model on the GPU</p>
<pre style="background-color:#2b2b2b;color:#a9b7c6;">
model = MyModel(*args, **kwargs)
model.load_state_dict(torch.load(your_model_file_path))
model.eval()  # switch the model to evaluation mode
if torch.cuda.is_available():
    print('PyTorch can use GPU on current machine!')</pre>&#8230; <a href="https://www.codelast.com/%e5%8e%9f%e5%88%9b-pytorch%e5%81%9ainference-prediction%e7%9a%84%e6%97%b6%e5%80%99%e5%a6%82%e4%bd%95%e4%bd%bf%e7%94%a8gpu/" class="read-more">Read More </a>]]></description>
										<content:encoded><![CDATA[<p>Without further ado, let's get straight to the point.</p>
<p><span style="color:#0000ff;">✔</span> Check whether a GPU can be used<br />
Several things can prevent PyTorch from using the GPU, e.g. the CPU-only build of PyTorch is installed, or the graphics driver is not set up correctly. When everything is working, the condition in the following if statement returns True:</p>
<pre style="background-color:#2b2b2b;color:#a9b7c6;">
if torch.cuda.is_available():
    print('PyTorch can use GPU on current machine!')</pre>
<p><span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
<span style="color: rgb(0, 0, 255);">✔</span>&nbsp;Put the model on the GPU</p>
<pre style="background-color:#2b2b2b;color:#a9b7c6;">
model = MyModel(*args, **kwargs)
model.load_state_dict(torch.load(your_model_file_path))
model.eval()  # switch the model to evaluation mode
if torch.cuda.is_available():
    print('PyTorch can use GPU on current machine!')
    device = torch.device("cuda")
    model.to(device)</pre>
<p>your_model_file_path is the path to the saved model file.<br />
<span id="more-12047"></span><br />
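<p>A related pitfall worth noting (my addition, not part of the original post): if the checkpoint was saved on a GPU machine but loaded where CUDA is unavailable, torch.load() needs a map_location argument, otherwise it tries to restore the tensors onto a CUDA device that does not exist. A minimal sketch, using a hypothetical stand-in model:</p>

```python
import torch
import torch.nn as nn

# A tiny stand-in model (hypothetical; substitute your own model class).
model = nn.Linear(4, 2)
torch.save(model.state_dict(), 'model.pt')

# map_location remaps tensors saved on a GPU onto whichever device is
# actually available, so the same loading code also runs on CPU-only machines.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
state_dict = torch.load('model.pt', map_location=device)
model.load_state_dict(state_dict)
model.to(device)
model.eval()  # switch the model to evaluation mode
```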
<span style="color: rgb(0, 0, 255);">✔</span>&nbsp;Use the GPU for inference/prediction<br />
For a single inference, assuming the model's input is&nbsp;model_input_tensor (of type torch.Tensor), the model output is computed as follows:</p>
<pre style="background-color:#2b2b2b;color:#a9b7c6;">
if torch.cuda.is_available():  # GPU available
    model_input_tensor = model_input_tensor.to(torch.device('cuda'))
model_output = model(model_input_tensor)  # inference</pre>
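<p>Two small refinements, added by the editor rather than taken from the original post: wrapping inference in torch.no_grad() skips gradient bookkeeping (saving memory and time), and .cpu() brings a GPU result back to host memory before further processing. A sketch with a hypothetical stand-in model:</p>

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)  # stand-in for the trained model
model.eval()
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model.to(device)

model_input_tensor = torch.randn(1, 4).to(device)
with torch.no_grad():  # gradients are not needed at inference time
    model_output = model(model_input_tensor)

# Bring the result back to host memory, e.g. for NumPy post-processing.
result = model_output.cpu().numpy()
```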
<p>
<span style="color: rgb(0, 0, 255);">✔</span>&nbsp;Check whether the running program is really using the GPU<br />
Use the nvidia-smi command to check.</p>
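<p>Besides nvidia-smi, you can also verify from inside Python that the model's parameters actually live on the GPU (a sketch added by the editor, again with a hypothetical stand-in model):</p>

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)  # stand-in model
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model.to(device)

# Every parameter of an nn.Module records the device it lives on.
param_device = next(model.parameters()).device
print(param_device)

if torch.cuda.is_available():
    # Non-zero allocated memory is another sign tensors really sit on the GPU.
    print(torch.cuda.memory_allocated())
```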
<p><span style="color: rgb(255, 255, 255);">Source: </span><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><span style="color: rgb(255, 255, 255);">https://www.codelast.com/</span></a><br />
<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;Copyright notice&nbsp;<span style="color: rgb(255, 0, 0);">➤➤</span>&nbsp;<br />
Reprints must credit the source: <u><a href="https://www.codelast.com/" rel="noopener noreferrer" target="_blank"><em><span style="color: rgb(0, 0, 255);"><strong style="font-size: 16px;"><span style="font-family: arial, helvetica, sans-serif;">codelast.com</span></strong></span></em></a></u>&nbsp;<br />
Thanks for following my WeChat official account (scan the QR code with WeChat):</p>
<p style="border: 0px; font-size: 13px; margin: 0px 0px 9px; outline: 0px; padding: 0px; color: rgb(77, 77, 77);">
	<img decoding="async" alt="wechat qrcode of codelast" src="https://www.codelast.com/codelast_wechat_qr_code.jpg" style="width: 200px; height: 200px;" /></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.codelast.com/%e5%8e%9f%e5%88%9b-pytorch%e5%81%9ainference-prediction%e7%9a%84%e6%97%b6%e5%80%99%e5%a6%82%e4%bd%95%e4%bd%bf%e7%94%a8gpu/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
