Getting started with TensorFlow

Win10, CPU version: one line at the Anaconda Prompt, pip install --upgrade tensorflow, done. Ten thousand times easier to install than Caffe.
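
A quick sanity check from the same prompt (cmd.exe needs double quotes around the -c argument; the single-quote form used later on the Ubuntu box won't work here):

python -c "import tensorflow as tf; print(tf.__version__)"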

The GPU version didn't install: this laptop has no CUDA at all, and a desktop that already had CUDA 8.0 and cuDNN 5.1 set up reported the same error, a missing setuptools egg.
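
The FileNotFoundError at the end of the session below (the missing setuptools-27.2.0-py3.6.egg) looks like an old Anaconda/pip quirk rather than anything GPU-specific: the upgrade removes the old setuptools egg, but a stale line in D:\Users\song\Anaconda3\lib\site-packages\easy-install.pth can still point at it, and pip's post-install version check then trips over the dead path. Note that the install itself reports success; only the version check blows up afterwards. A hedged fix, assuming that is indeed the cause, is to delete the stale egg line from easy-install.pth (if present) and force-reinstall:

pip install --upgrade --force-reinstall setuptools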

The command-line session:

  1 (D:\Users\song\Anaconda3) C:\SPB_Data>python --version
  2 Python 3.6.0 :: Anaconda 4.3.1 (64-bit)
  3 
  4 (D:\Users\song\Anaconda3) C:\SPB_Data>pip -V
  5 pip 9.0.1 from D:\Users\song\Anaconda3\lib\site-packages (python 3.6)
  6 
  7 (D:\Users\song\Anaconda3) C:\SPB_Data>pip3 install --upgrade tensorflow-gpu
  8 'pip3' is not recognized as an internal or external command,
  9 operable program or batch file.
 10 
 11 (D:\Users\song\Anaconda3) C:\SPB_Data>pip install --upgrade tensorflow-gpu
 12 Collecting tensorflow-gpu
 13   Downloading tensorflow_gpu-1.3.0-cp36-cp36m-win_amd64.whl (60.0MB)
 14     100% |████████████████████████████████| 60.0MB 16kB/s
 15 Collecting numpy>=1.11.0 (from tensorflow-gpu)
 16   Downloading numpy-1.13.3-cp36-none-win_amd64.whl (13.1MB)
 17     100% |████████████████████████████████| 13.1MB 73kB/s
 18 Collecting protobuf>=3.3.0 (from tensorflow-gpu)
 19   Downloading protobuf-3.4.0-py2.py3-none-any.whl (375kB)
 20     100% |████████████████████████████████| 378kB 667kB/s
 21 Collecting wheel>=0.26 (from tensorflow-gpu)
 22   Downloading wheel-0.30.0-py2.py3-none-any.whl (49kB)
 23     100% |████████████████████████████████| 51kB 371kB/s
 24 Collecting tensorflow-tensorboard<0.2.0,>=0.1.0 (from tensorflow-gpu)
 25   Downloading tensorflow_tensorboard-0.1.8-py3-none-any.whl (1.6MB)
 26     100% |████████████████████████████████| 1.6MB 413kB/s
 27 Collecting six>=1.10.0 (from tensorflow-gpu)
 28   Downloading six-1.11.0-py2.py3-none-any.whl
 29 Collecting setuptools (from protobuf>=3.3.0->tensorflow-gpu)
 30   Downloading setuptools-36.6.0-py2.py3-none-any.whl (481kB)
 31     100% |████████████████████████████████| 481kB 734kB/s
 32 Collecting bleach==1.5.0 (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow-gpu)
 33   Downloading bleach-1.5.0-py2.py3-none-any.whl
 34 Collecting werkzeug>=0.11.10 (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow-gpu)
 35   Downloading Werkzeug-0.12.2-py2.py3-none-any.whl (312kB)
 36     100% |████████████████████████████████| 317kB 1.7MB/s
 37 Collecting html5lib==0.9999999 (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow-gpu)
 38   Downloading html5lib-0.9999999.tar.gz (889kB)
 39     100% |████████████████████████████████| 890kB 502kB/s
 40 Collecting markdown>=2.6.8 (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow-gpu)
 41   Downloading Markdown-2.6.9.tar.gz (271kB)
 42     100% |████████████████████████████████| 276kB 687kB/s
 43 Building wheels for collected packages: html5lib, markdown
 44   Running setup.py bdist_wheel for html5lib ... done
 45   Stored in directory: C:\Users\song\AppData\Local\pip\Cache\wheels\6f\85\6c\56b8e1292c6214c4eb73b9dda50f53e8e977bf65989373c962
 46   Running setup.py bdist_wheel for markdown ... done
 47   Stored in directory: C:\Users\song\AppData\Local\pip\Cache\wheels\bf\46\10\c93e17ae86ae3b3a919c7b39dad3b5ccf09aeb066419e5c1e5
 48 Successfully built html5lib markdown
 49 Installing collected packages: numpy, setuptools, six, protobuf, wheel, html5lib, bleach, werkzeug, markdown, tensorflow-tensorboard, tensorflow-gpu
 50   Found existing installation: numpy 1.11.3
 51     Uninstalling numpy-1.11.3:
 52       Successfully uninstalled numpy-1.11.3
 53   Found existing installation: setuptools 27.2.0
 54     Uninstalling setuptools-27.2.0:
 55       Successfully uninstalled setuptools-27.2.0
 56   Found existing installation: six 1.10.0
 57     DEPRECATION: Uninstalling a distutils installed project (six) has been deprecated and will be removed in a future version. This is due to the fact that uninstalling a distutils project will only partially uninstall the project.
 58     Uninstalling six-1.10.0:
 59       Successfully uninstalled six-1.10.0
 60   Found existing installation: wheel 0.29.0
 61     Uninstalling wheel-0.29.0:
 62       Successfully uninstalled wheel-0.29.0
 63   Found existing installation: Werkzeug 0.11.15
 64     Uninstalling Werkzeug-0.11.15:
 65       Successfully uninstalled Werkzeug-0.11.15
 66 Successfully installed bleach-1.5.0 html5lib-0.9999999 markdown-2.6.9 numpy-1.13.3 protobuf-3.4.0 setuptools-36.6.0 six-1.11.0 tensorflow-gpu-1.3.0 tensorflow-tensorboard-0.1.8 werkzeug-0.12.2 wheel-0.30.0
 67 Traceback (most recent call last):
 68   File "D:\Users\song\Anaconda3\Scripts\pip-script.py", line 5, in <module>
 69     sys.exit(pip.main())
 70   File "D:\Users\song\Anaconda3\lib\site-packages\pip\__init__.py", line 249, in main
 71     return command.main(cmd_args)
 72   File "D:\Users\song\Anaconda3\lib\site-packages\pip\basecommand.py", line 252, in main
 73     pip_version_check(session)
 74   File "D:\Users\song\Anaconda3\lib\site-packages\pip\utils\outdated.py", line 102, in pip_version_check
 75     installed_version = get_installed_version("pip")
 76   File "D:\Users\song\Anaconda3\lib\site-packages\pip\utils\__init__.py", line 838, in get_installed_version
 77     working_set = pkg_resources.WorkingSet()
 78   File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 644, in __init__
 79     self.add_entry(entry)
 80   File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 700, in add_entry
 81     for dist in find_distributions(entry, True):
 82   File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1949, in find_eggs_in_zip
 83     if metadata.has_metadata('PKG-INFO'):
 84   File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1463, in has_metadata
 85     return self.egg_info and self._has(self._fn(self.egg_info, name))
 86   File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1823, in _has
 87     return zip_path in self.zipinfo or zip_path in self._index()
 88   File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1703, in zipinfo
 89     return self._zip_manifests.load(self.loader.archive)
 90   File "D:\Users\song\Anaconda3\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 1643, in load
 91     mtime = os.stat(path).st_mtime
 92 FileNotFoundError: [WinError 2] The system cannot find the file specified: 'D:\\Users\\song\\Anaconda3\\lib\\site-packages\\setuptools-27.2.0-py3.6.egg'
 93 
 94 (D:\Users\song\Anaconda3) C:\SPB_Data>cd ..
 95 
 96 (D:\Users\song\Anaconda3) C:\>cd ..
 97 
 98 (D:\Users\song\Anaconda3) C:\>ls
 99 'ls' is not recognized as an internal or external command,
100 operable program or batch file.
101 
102 (D:\Users\song\Anaconda3) C:\>python --version
103 Python 3.6.0 :: Anaconda 4.3.1 (64-bit)
104 
105 (D:\Users\song\Anaconda3) C:\>nvcc -V
106 'nvcc' is not recognized as an internal or external command,
107 operable program or batch file.
108 
109 (D:\Users\song\Anaconda3) C:\>pip install --upgrade tensorflow
110 Collecting tensorflow
111   Downloading tensorflow-1.3.0-cp36-cp36m-win_amd64.whl (25.5MB)
112     100% |████████████████████████████████| 25.5MB 29kB/s
113 Requirement already up-to-date: protobuf>=3.3.0 in d:\users\song\anaconda3\lib\site-packages (from tensorflow)
114 Requirement already up-to-date: wheel>=0.26 in d:\users\song\anaconda3\lib\site-packages (from tensorflow)
115 Requirement already up-to-date: tensorflow-tensorboard<0.2.0,>=0.1.0 in d:\users\song\anaconda3\lib\site-packages (from tensorflow)
116 Requirement already up-to-date: six>=1.10.0 in d:\users\song\anaconda3\lib\site-packages (from tensorflow)
117 Requirement already up-to-date: numpy>=1.11.0 in d:\users\song\anaconda3\lib\site-packages (from tensorflow)
118 Requirement already up-to-date: setuptools in d:\users\song\anaconda3\lib\site-packages (from protobuf>=3.3.0->tensorflow)
119 Requirement already up-to-date: markdown>=2.6.8 in d:\users\song\anaconda3\lib\site-packages (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow)
120 Requirement already up-to-date: bleach==1.5.0 in d:\users\song\anaconda3\lib\site-packages (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow)
121 Requirement already up-to-date: html5lib==0.9999999 in d:\users\song\anaconda3\lib\site-packages (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow)
122 Requirement already up-to-date: werkzeug>=0.11.10 in d:\users\song\anaconda3\lib\site-packages (from tensorflow-tensorboard<0.2.0,>=0.1.0->tensorflow)
123 Installing collected packages: tensorflow
124 Successfully installed tensorflow-1.3.0
125 
126 (D:\Users\song\Anaconda3) C:\>import tensorflow as tf
127 'import' is not recognized as an internal or external command,
128 operable program or batch file.
129 
130 (D:\Users\song\Anaconda3) C:\>python
131 Python 3.6.0 |Anaconda 4.3.1 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
132 Type "help", "copyright", "credits" or "license" for more information.
133 >>> import tensorflow as tf
134 >>> a = tf.random_normal((100,100))
135 >>> b = tf.random_normal((100,500))
136 >>> c=tf.matmul(a,b)
137 >>> sess=tf.InteractiveSession()
138 2017-10-29 20:46:03.615036: W C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
139 2017-10-29 20:46:03.620666: W C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
140 >>> sess.run(c)
141 array([[ -2.01546478e+01,  -1.21840429e+01,   8.52634966e-01, ...,
142          -1.93460350e+01,  -1.17136412e+01,  -2.81856956e+01],
143        [ -2.86180496e+00,   1.86777287e+01,   2.39728212e-01, ...,
144           1.65606441e+01,  -8.35585117e+00,   1.21092701e+01],
145        [ -6.70668936e+00,  -1.92020512e+00,  -8.63678837e+00, ...,
146           1.19851971e+01,  -1.95774388e+00,  -3.46706104e+00],
147        ...,
148        [ -6.20419502e+00,  -1.58898029e+01,   1.47155542e+01, ...,
149          -6.35781908e+00,  -7.09256840e+00,   1.04180880e+01],
150        [ -1.14867371e-03,  -2.47349381e+00,   1.40450490e+00, ...,
151           1.87805653e+00,   7.70393276e+00,  -1.11452806e+00],
152        [ -1.81114292e+01,   2.83652916e+01,   2.23067703e+01, ...,
153           4.72095060e+00,   2.01743245e+00,   9.46466255e+00]], dtype=float32)
154 >>> c
155 <tf.Tensor 'MatMul:0' shape=(100, 500) dtype=float32>
156 >>> print(c)
157 Tensor("MatMul:0", shape=(100, 500), dtype=float32)
158 >>> print(c.val)
159 Traceback (most recent call last):
160   File "<stdin>", line 1, in <module>
161 AttributeError: 'Tensor' object has no attribute 'val'
162 >>> print(c.eval())
163 [[  7.44645548e+00   7.01777339e-01  -3.29522681e+00 ...,  -4.11035490e+00
164     6.88585615e+00  -1.03243275e+01]
165  [  1.74935007e+00  -8.06512642e+00  -8.94767094e+00 ...,  -8.51691341e+00
166    -6.86603403e+00   9.46757889e+00]
167  [ -6.61030436e+00   5.86357307e+00   1.51259956e+01 ...,  -9.53737926e+00
168     1.95381641e-02   1.16717541e+00]
169  ...,
170  [ -5.34449625e+00   1.13798809e+00   1.34737101e+01 ...,   6.86746025e+00
171     3.37234330e+00  -9.16017354e-01]
172  [ -3.89829564e+00   1.19947767e+00   9.16424465e+00 ...,   7.61591375e-01
173    -1.70225441e-01   1.02892227e+01]
174  [  1.97680518e-01  -1.99925423e+01  -9.40755844e+00 ...,   5.44214249e+00
175     1.52138865e+00   2.48984170e+00]]
176 >>> print(a)
177 Tensor("random_normal:0", shape=(100, 100), dtype=float32)
178 >>> sess=tf.InteractiveSession()
179 >>> print(sess.run(a))
180 [[-1.394485   -1.95048952  0.76553309 ..., -0.43924141 -1.21975422
181    0.60572529]
182  [ 0.34292024  0.86016667 -2.25437665 ...,  1.67957187  1.57846153
183   -1.53106809]
184  [ 0.08453497  0.59995687 -1.37805259 ..., -0.92989731 -0.07856822
185   -1.36062932]
186  ...,
187  [-0.41187105  0.60689414 -0.44695681 ...,  0.51408201 -1.49676847
188    0.95741159]
189  [-1.01903558 -1.24220276  0.12283699 ...,  0.53144586 -0.2782338
190    0.34964591]
191  [ 0.27783027  0.5017578  -1.0619179  ...,  0.4974283  -0.04771407
192    0.48028085]]
193 >>> ls
194 Traceback (most recent call last):
195   File "<stdin>", line 1, in <module>
196 NameError: name 'ls' is not defined
197 >>> exit()
198 
199 (D:\Users\song\Anaconda3) C:\>e:\
200 'e:\' is not recognized as an internal or external command,
201 operable program or batch file.
202 
203 (D:\Users\song\Anaconda3) C:\>cd e:\
204 
205 (D:\Users\song\Anaconda3) C:\>python minst.py
206   File "minst.py", line 16
207 SyntaxError: Non-UTF-8 code starting with '\xb0' in file minst.py on line 16, but no encoding declared; see http://python.org/dev/peps/pep-0263/ for details
208 
209 (D:\Users\song\Anaconda3) C:\>python minst.py
210   File "minst.py", line 16
211 SyntaxError: Non-UTF-8 code starting with '\xb0' in file minst.py on line 16, but no encoding declared; see http://python.org/dev/peps/pep-0263/ for details
212 
213 (D:\Users\song\Anaconda3) C:\>python minst.py
214   File "minst.py", line 16
215 SyntaxError: Non-UTF-8 code starting with '\xb0' in file minst.py on line 16, but no encoding declared; see http://python.org/dev/peps/pep-0263/ for details
216 
217 (D:\Users\song\Anaconda3) C:\>python
218 Python 3.6.0 |Anaconda 4.3.1 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
219 Type "help", "copyright", "credits" or "license" for more information.
220 >>> import tensorflow as tf
221 >>> flags = tf.app.flags
222 >>> FLAGS = flags.FLAGS
223 >>> flags.DEFINE_string('data_dir', '/tmp/data/', 'Directory for storing data')
224 >>>
225 >>> mnist = input_data.read_data_sets(FLAGS.data_dir, one_hot=True)
226 Traceback (most recent call last):
227   File "<stdin>", line 1, in <module>
228 NameError: name 'input_data' is not defined
229 >>> from __future__ import absolute_import
230 >>> from __future__ import division
231 >>> from __future__ import print_function
232 >>> from tensorflow.examples.tutorials.mnist import input_data
233 >>>
234 >>> mnist = input_data.read_data_sets(FLAGS.data_dir, one_hot=True)
235 Successfully downloaded train-images-idx3-ubyte.gz 9912422 bytes.
236 Extracting /tmp/data/train-images-idx3-ubyte.gz
237 Traceback (most recent call last):
238   File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 1318, in do_open
239     encode_chunked=req.has_header('Transfer-encoding'))
240   File "D:\Users\song\Anaconda3\lib\http\client.py", line 1239, in request
241     self._send_request(method, url, body, headers, encode_chunked)
242   File "D:\Users\song\Anaconda3\lib\http\client.py", line 1285, in _send_request
243     self.endheaders(body, encode_chunked=encode_chunked)
244   File "D:\Users\song\Anaconda3\lib\http\client.py", line 1234, in endheaders
245     self._send_output(message_body, encode_chunked=encode_chunked)
246   File "D:\Users\song\Anaconda3\lib\http\client.py", line 1026, in _send_output
247     self.send(msg)
248   File "D:\Users\song\Anaconda3\lib\http\client.py", line 964, in send
249     self.connect()
250   File "D:\Users\song\Anaconda3\lib\http\client.py", line 1400, in connect
251     server_hostname=server_hostname)
252   File "D:\Users\song\Anaconda3\lib\ssl.py", line 401, in wrap_socket
253     _context=self, _session=session)
254   File "D:\Users\song\Anaconda3\lib\ssl.py", line 808, in __init__
255     self.do_handshake()
256   File "D:\Users\song\Anaconda3\lib\ssl.py", line 1061, in do_handshake
257     self._sslobj.do_handshake()
258   File "D:\Users\song\Anaconda3\lib\ssl.py", line 683, in do_handshake
259     self._sslobj.do_handshake()
260 ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:749)
261 
262 During handling of the above exception, another exception occurred:
263 
264 Traceback (most recent call last):
265   File "<stdin>", line 1, in <module>
266   File "D:\Users\song\Anaconda3\lib\site-packages\tensorflow\contrib\learn\python\learn\datasets\mnist.py", line 240, in read_data_sets
267     SOURCE_URL + TRAIN_LABELS)
268   File "D:\Users\song\Anaconda3\lib\site-packages\tensorflow\contrib\learn\python\learn\datasets\base.py", line 208, in maybe_download
269     temp_file_name, _ = urlretrieve_with_retry(source_url)
270   File "D:\Users\song\Anaconda3\lib\site-packages\tensorflow\contrib\learn\python\learn\datasets\base.py", line 165, in wrapped_fn
271     return fn(*args, **kwargs)
272   File "D:\Users\song\Anaconda3\lib\site-packages\tensorflow\contrib\learn\python\learn\datasets\base.py", line 190, in urlretrieve_with_retry
273     return urllib.request.urlretrieve(url, filename)
274   File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 248, in urlretrieve
275     with contextlib.closing(urlopen(url, data)) as fp:
276   File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 223, in urlopen
277     return opener.open(url, data, timeout)
278   File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 526, in open
279     response = self._open(req, data)
280   File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 544, in _open
281     '_open', req)
282   File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 504, in _call_chain
283     result = func(*args)
284   File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 1361, in https_open
285     context=self._context, check_hostname=self._check_hostname)
286   File "D:\Users\song\Anaconda3\lib\urllib\request.py", line 1320, in do_open
287     raise URLError(err)
288 urllib.error.URLError: <urlopen error EOF occurred in violation of protocol (_ssl.c:749)>
289 >>> import requests
290 >>> >>> from requests.adapters import HTTPAdapter
291   File "<stdin>", line 1
292     >>> from requests.adapters import HTTPAdapter
293      ^
294 SyntaxError: invalid syntax
295 >>> >>> from requests.packages.urllib3.poolmanager import PoolManager
296   File "<stdin>", line 1
297     >>> from requests.packages.urllib3.poolmanager import PoolManager
298      ^
299 SyntaxError: invalid syntax
300 >>> >>> import ssl
301   File "<stdin>", line 1
302     >>> import ssl
303      ^
304 SyntaxError: invalid syntax
305 >>> >>>
306   File "<stdin>", line 1
307     >>>
308      ^
309 SyntaxError: invalid syntax
310 >>> >>> class MyAdapter(HTTPAdapter):
311   File "<stdin>", line 1
312     >>> class MyAdapter(HTTPAdapter):
313      ^
314 SyntaxError: invalid syntax
315 >>> ...     def init_poolmanager(self, connections, maxsize, block=False):
316   File "<stdin>", line 1
317     ...     def init_poolmanager(self, connections, maxsize, block=False):
318               ^
319 SyntaxError: invalid syntax
320 >>> ...         self.poolmanager = PoolManager(num_pools=connections,
321   File "<stdin>", line 1
322     ...         self.poolmanager = PoolManager(num_pools=connections,
323                    ^
324 SyntaxError: invalid syntax
325 >>> ...                                        maxsize=maxsize,
326   File "<stdin>", line 1
327     ...                                        maxsize=maxsize,
328                                                      ^
329 SyntaxError: invalid syntax
330 >>> ...                                        block=block,
331   File "<stdin>", line 1
332     ...                                        block=block,
333                                                    ^
334 SyntaxError: invalid syntax
335 >>> ...                                        ssl_version=ssl.PROTOCOL_TLSv1)
336   File "<stdin>", line 1
337     ...                                        ssl_version=ssl.PROTOCOL_TLSv1)
338                                                          ^
339 SyntaxError: invalid syntax
340 >>> ...
341 Ellipsis
342 >>> >>> s = requests.Session()
343   File "<stdin>", line 1
344     >>> s = requests.Session()
345      ^
346 SyntaxError: invalid syntax
347 >>> >>> s.mount('https://', MyAdapter())
348   File "<stdin>", line 1
349     >>> s.mount('https://', MyAdapter())
350      ^
351 SyntaxError: invalid syntax
352 >>> >>> s.get('https://www.supercash.cz')
353   File "<stdin>", line 1
354     >>> s.get('https://www.supercash.cz')
355      ^
356 SyntaxError: invalid syntax
357 >>> <Response [200]>
358   File "<stdin>", line 1
359     <Response [200]>
360     ^
361 SyntaxError: invalid syntax
362 >>> mnist = input_data.read_data_sets(FLAGS.data_dir, one_hot=True)
363 Extracting /tmp/data/train-images-idx3-ubyte.gz
364 Successfully downloaded train-labels-idx1-ubyte.gz 28881 bytes.
365 Extracting /tmp/data/train-labels-idx1-ubyte.gz
366 Successfully downloaded t10k-images-idx3-ubyte.gz 1648877 bytes.
367 Extracting /tmp/data/t10k-images-idx3-ubyte.gz
368 Successfully downloaded t10k-labels-idx1-ubyte.gz 4542 bytes.
369 Extracting /tmp/data/t10k-labels-idx1-ubyte.gz
370 >>> x = tf.placeholder(tf.float32, [None, 784]) # placeholder
371 >>> y = tf.placeholder(tf.float32, [None, 10])
372 >>> W = tf.Variable(tf.zeros([784, 10]))
373 >>> b = tf.Variable(tf.zeros([10]))
374 >>> a = tf.nn.softmax(tf.matmul(x, W) + b)
375 >>> cross_entropy = tf.reduce_mean(-tf.reduce_sum(y * tf.log(a), reduction_indices=[1]))  # cross-entropy loss
376 >>> optimizer = tf.train.GradientDescentOptimizer(0.5) # gradient descent, learning rate 0.5
377 >>> train = optimizer.minimize(cross_entropy) # training objective: minimize the loss
378 >>>
379 >>> # Test trained model
380 ... correct_prediction = tf.equal(tf.argmax(a, 1), tf.argmax(y, 1))
381 >>> accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
382 >>> correct_prediction = tf.equal(tf.argmax(a, 1), tf.argmax(y, 1))
383 >>> accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
384 >>> sess = tf.InteractiveSession()      # create an interactive session
385 2017-10-29 21:28:03.960497: W C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
386 2017-10-29 21:28:03.968465: W C:\tf_jenkins\home\workspace\rel-win\M\windows\PY\36\tensorflow\core\platform\cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
387 >>> tf.initialize_all_variables().run()
388 WARNING:tensorflow:From D:\Users\song\Anaconda3\lib\site-packages\tensorflow\python\util\tf_should_use.py:175: initialize_all_variables (from tensorflow.python.ops.variables) is deprecated and will be removed after 2017-03-02.
389 Instructions for updating:
390 Use `tf.global_variables_initializer` instead.
391 >>> for i in range(1000):
392 ...     batch_xs, batch_ys = mnist.train.next_batch(100)
393 ...     train.run({x: batch_xs, y: batch_ys})
394 ... print(sess.run(accuracy,feed_dict={x:mnist.test.images,y:mnist.test.labels}))
395   File "<stdin>", line 4
396     print(sess.run(accuracy,feed_dict={x:mnist.test.images,y:mnist.test.labels}))
397         ^
398 SyntaxError: invalid syntax
399 >>> tf.initialize_all_variables().run()
400 WARNING:tensorflow:From D:\Users\song\Anaconda3\lib\site-packages\tensorflow\python\util\tf_should_use.py:175: initialize_all_variables (from tensorflow.python.ops.variables) is deprecated and will be removed after 2017-03-02.
401 Instructions for updating:
402 Use `tf.global_variables_initializer` instead.
403 >>> tf.global_variables_initializer().run()
404 >>> for i in range(1000):
405 ...     batch_xs, batch_ys = mnist.train.next_batch(100)
406 ...     train.run({x: batch_xs, y: batch_ys})
407 ...
408 >>> print(sess.run(accuracy,feed_dict={x:mnist.test.images,y:mnist.test.labels}))
409 0.9154
410 >>>
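
For reference, the working parts of the session above condense into a single script (the same TF 1.x calls, with the dead ends removed):

from tensorflow.examples.tutorials.mnist import input_data
import tensorflow as tf

mnist = input_data.read_data_sets('/tmp/data/', one_hot=True)

x = tf.placeholder(tf.float32, [None, 784])   # flattened 28x28 images
y = tf.placeholder(tf.float32, [None, 10])    # one-hot labels
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
a = tf.nn.softmax(tf.matmul(x, W) + b)

# Loss: cross-entropy; optimizer: gradient descent with learning rate 0.5.
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y * tf.log(a), reduction_indices=[1]))
train = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

correct_prediction = tf.equal(tf.argmax(a, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))

sess = tf.InteractiveSession()
tf.global_variables_initializer().run()
for i in range(1000):
    batch_xs, batch_ys = mnist.train.next_batch(100)
    train.run({x: batch_xs, y: batch_ys})
print(sess.run(accuracy, feed_dict={x: mnist.test.images, y: mnist.test.labels}))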

----------2018.02.10 Another machine, an Ubuntu server; TensorFlow already installed and importable from python--------------

Check the TensorFlow version:

python -c 'import tensorflow as tf; print(tf.__version__)'

or, from inside the interpreter:

>>> tf.__version__
>>> tf.__path__

Using the TensorFlow-Slim app:

The README in the models/research/slim directory on GitHub explains this in detail.

python download_and_convert_data.py --dataset_name=flowers --dataset_dir="/home/.../data/"

The download finishes after a little while; ls /home/.../data then shows the converted TFRecord files and a labels file.

Create creatingTFSlimDataset.py containing:

import tensorflow as tf
from datasets import flowers

slim = tf.contrib.slim

# Selects the 'validation' dataset.
dataset = flowers.get_split('validation', "/home/.../data/")

# Creates a TF-Slim DataProvider which reads the dataset in the background
# during both training and testing.
provider = slim.dataset_data_provider.DatasetDataProvider(dataset)
[image, label] = provider.get(['image', 'label'])

Run python creatingTFSlimDataset.py from the slim directory.
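
As written, the script only builds the graph; nothing is actually read until a session runs it with queue runners started. A minimal sketch of pulling one decoded sample out of the provider (the Coordinator boilerplate is standard TF 1.x, not from the README):

with tf.Session() as sess:
    coord = tf.train.Coordinator()
    # DatasetDataProvider registers queue runners; start them, or run() blocks.
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    np_image, np_label = sess.run([image, label])
    print(np_image.shape, np_label)
    coord.request_stop()
    coord.join(threads)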

----------2018.02.11 macbook pro-------------- 

The macOS install follows the official site and tutorials. Every time you open a terminal, cd into the tf directory and run

source bin/activate

to enter the environment; run deactivate to leave the TensorFlow environment.
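
For completeness, the one-time setup those commands assume (a virtualenv named tf; the post only shows the activation step, so the creation line is an assumption):

virtualenv tf
cd tf
source bin/activate
pip install tensorflow
deactivate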

----------2018.02.11 Training a CNN on my own dataset----------

The code comes entirely from here. Write a Python script, paste the author's code into it, and add the following at the top and bottom of the file:

import os
import numpy as np
import tensorflow as tf

if __name__ == '__main__':
    run_training()

You then either strip the inputData. and model. module prefixes inside run_training(), or split the code into several .py files and import them (see the sketch below).
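
If you split it, the top of the training script would look roughly like this (inputData.py and model.py being the tutorial's own module names):

import inputData   # data loading; provides get_files()
import model       # the CNN definition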

Inside run_training(), point it at your own data. The author's code does 2-class classification: create two directories named 0 and 1 and put the image files directly inside them, nothing else.

Running python xx.py in that directory kept failing with a baffling error:

2018-02-11 16:14:27.688087: W tensorflow/core/framework/op_kernel.cc:1188] Unimplemented: Cast float to string is not supported
2018-02-11 16:14:27.691410: W tensorflow/core/framework/op_kernel.cc:1188] Unimplemented: Cast float to string is not supported
2018-02-11 16:14:27.700514: W tensorflow/core/framework/op_kernel.cc:1188] Unimplemented: Cast float to string is not supported
2018-02-11 16:14:27.700547: E tensorflow/core/common_runtime/executor.cc:651] Executor failed to create kernel. Unimplemented: Cast float to string is not supported
     [[Node: Cast = Cast[DstT=DT_STRING, SrcT=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"](Cast/x)]]
tensorflow.python.framework.errors_impl.InvalidArgumentError: Expected image (JPEG, PNG, or GIF), got unknown format starting with '\000\000\000\001Bud1\000\000(\000\000\000\010\000'

Solved as described here. macOS drops a hidden .DS_Store file into every directory, every single one, so the for loop over os.listdir(filename+train_class) in get_files picks up this stray non-image file as well. Check and remove it with:

ls -d .*     # list the hidden files in a directory
rm .DS_Store # delete it
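
Alternatively, guard the loading code so dotfiles never reach the image decoder at all. A minimal sketch (list_visible is a hypothetical helper; get_files and the directory layout are the tutorial's):

import os

def list_visible(path):
    # Skip macOS metadata such as .DS_Store (and any other dotfile)
    # so that only real image files are returned.
    return [name for name in os.listdir(path) if not name.startswith('.')]

Using list_visible(filename + train_class) in place of the raw os.listdir call makes get_files immune to .DS_Store reappearing.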

After brute-force deleting every .DS_Store, it runs:

   1 (tf) ...$ python selfDataTest.py
   2 2018-02-11 16:29:32.737708: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.2 AVX AVX2 FMA
   3 0
   4 loss:0.693223297596 accuracy:0.5
   5 1
   6 loss:0.693074345589 accuracy:0.5
   7 2
   8 loss:0.697106897831 accuracy:0.25
   9 3
  10 loss:0.693120956421 accuracy:0.5
  11 4
  12 loss:0.693217039108 accuracy:0.5
  13 5
  14 loss:0.693105101585 accuracy:0.5
  15 6
  16 loss:0.696964502335 accuracy:0.25
  17 7
  18 loss:0.689658999443 accuracy:0.75
  19 8
  20 loss:0.689396262169 accuracy:0.75
  21 9
  22 loss:0.689066112041 accuracy:0.75
  23 10
  24 loss:0.688840508461 accuracy:0.75
  25 11
  26 loss:0.693139314651 accuracy:0.5
  27 12
  28 loss:0.683785676956 accuracy:1.0
  29 13
  30 loss:0.703975439072 accuracy:0.0
  31 14
  32 loss:0.68853032589 accuracy:0.75
  33 15
  34 loss:0.698201835155 accuracy:0.25
  35 16
  36 loss:0.68848156929 accuracy:0.75
  37 17
  38 loss:0.698279738426 accuracy:0.25
  39 18
  40 loss:0.693163573742 accuracy:0.5
  41 19
  42 loss:0.6931681633 accuracy:0.5
  43 20
  44 loss:0.683992028236 accuracy:1.0
  45 21
  46 loss:0.693161666393 accuracy:0.5
  47 22
  48 loss:0.698703587055 accuracy:0.25
  49 23
  50 loss:0.693104684353 accuracy:0.5
  51 24
  52 loss:0.68318516016 accuracy:1.0
  53 25
  54 loss:0.699333965778 accuracy:0.25
  55 26
  56 loss:0.693171679974 accuracy:0.5
  57 27
  58 loss:0.687688589096 accuracy:0.75
  59 28
  60 loss:0.699294626713 accuracy:0.25
  61 29
  62 loss:0.698648869991 accuracy:0.25
  63 30
  64 loss:0.697887659073 accuracy:0.25
  65 31
  66 loss:0.697125077248 accuracy:0.25
  67 32
  68 loss:0.693179786205 accuracy:0.5
  69 33
  70 loss:0.690038383007 accuracy:0.75
  71 34
  72 loss:0.693158328533 accuracy:0.5
  73 35
  74 loss:0.693139135838 accuracy:0.5
  75 36
  76 loss:0.693126440048 accuracy:0.5
  77 37
  78 loss:0.6970089674 accuracy:0.25
  79 38
  80 loss:0.693112254143 accuracy:0.5
  81 39
  82 loss:0.696039140224 accuracy:0.25
  83 40
  84 loss:0.691227436066 accuracy:0.75
  85 41
  86 loss:0.6871124506 accuracy:1.0
  87 42
  88 loss:0.698171555996 accuracy:0.25
  89 43
  90 loss:0.693155050278 accuracy:0.5
  91 44
  92 loss:0.693078219891 accuracy:0.5
  93 45
  94 loss:0.700998663902 accuracy:0.0
  95 46
  96 loss:0.69156730175 accuracy:0.75
  97 47
  98 loss:0.693068742752 accuracy:0.5
  99 48
 100 loss:0.69571262598 accuracy:0.25
 101 49
 102 loss:0.694398403168 accuracy:0.25
 103 50
 104 loss:0.692993998528 accuracy:0.75
 105 51
 106 loss:0.69304227829 accuracy:0.75
 107 52
 108 loss:0.691499948502 accuracy:0.75
 109 53
 110 loss:0.699331462383 accuracy:0.0
 111 54
 112 loss:0.693157672882 accuracy:0.25
 113 55
 114 loss:0.693060457706 accuracy:0.5
 115 56
 116 loss:0.694615125656 accuracy:0.25
 117 57
 118 loss:0.693363189697 accuracy:0.25
 119 58
 120 loss:0.693014979362 accuracy:0.5
 121 59
 122 loss:0.694661080837 accuracy:0.25
 123 60
 124 loss:0.693545818329 accuracy:0.25
 125 61
 126 loss:0.693045854568 accuracy:0.5
 127 62
 128 loss:0.693035364151 accuracy:0.5
 129 63
 130 loss:0.694476723671 accuracy:0.25
 131 64
 132 loss:0.692974746227 accuracy:0.5
 133 65
 134 loss:0.692882657051 accuracy:0.5
 135 66
 136 loss:0.693762481213 accuracy:0.25
 137 67
 138 loss:0.690908133984 accuracy:0.75
 139 68
 140 loss:0.702915489674 accuracy:0.0
 141 69
 142 loss:0.692815005779 accuracy:0.5
 143 70
 144 loss:0.694090306759 accuracy:0.25
 145 71
 146 loss:0.690848052502 accuracy:0.75
 147 72
 148 loss:0.698566675186 accuracy:0.25
 149 73
 150 loss:0.689132213593 accuracy:1.0
 151 74
 152 loss:0.675133347511 accuracy:1.0
 153 75
 154 loss:0.693369984627 accuracy:0.5
 155 76
 156 loss:0.709350824356 accuracy:0.25
 157 77
 158 loss:0.693330347538 accuracy:0.5
 159 78
 160 loss:0.704541862011 accuracy:0.25
 161 79
 162 loss:0.706527590752 accuracy:0.0
 163 80
 164 loss:0.692816853523 accuracy:0.5
 165 81
 166 loss:0.694923400879 accuracy:0.25
 167 82
 168 loss:0.689839780331 accuracy:0.75
 169 83
 170 loss:0.70052075386 accuracy:0.25
 171 84
 172 loss:0.692834496498 accuracy:0.5
 173 85
 174 loss:0.692559659481 accuracy:0.5
 175 86
 176 loss:0.695177018642 accuracy:0.25
 177 87
 178 loss:0.697179615498 accuracy:0.25
 179 88
 180 loss:0.697944283485 accuracy:0.0
 181 89
 182 loss:0.682408571243 accuracy:0.75
 183 90
 184 loss:0.676290035248 accuracy:0.75
 185 91
 186 loss:0.646753549576 accuracy:1.0
 187 92
 188 loss:0.657571673393 accuracy:0.75
 189 93
 190 loss:0.74190735817 accuracy:0.25
 191 94
 192 loss:0.729316711426 accuracy:0.25
 193 95
 194 loss:0.668318748474 accuracy:0.75
 195 96
 196 loss:0.726832032204 accuracy:0.25
 197 97
 198 loss:0.692601561546 accuracy:0.5
 199 98
 200 loss:0.692799925804 accuracy:0.5
 201 99
 202 loss:0.671272993088 accuracy:0.75
 203 100
 204 loss:0.663210988045 accuracy:0.75
 205 101
 206 loss:0.69605076313 accuracy:0.5
 207 102
 208 loss:0.733357787132 accuracy:0.25
 209 103
 210 loss:0.717833042145 accuracy:0.25
 211 104
 212 loss:0.691287755966 accuracy:0.5
 213 105
 214 loss:0.691427409649 accuracy:0.5
 215 106
 216 loss:0.704467654228 accuracy:0.25
 217 107
 218 loss:0.689470171928 accuracy:1.0
 219 108
 220 loss:0.704850375652 accuracy:0.25
 221 109
 222 loss:0.68930375576 accuracy:1.0
 223 110
 224 loss:0.690158724785 accuracy:1.0
 225 111
 226 loss:0.678738832474 accuracy:0.75
 227 112
 228 loss:0.690320134163 accuracy:0.5
 229 113
 230 loss:0.685899198055 accuracy:0.5
 231 114
 232 loss:0.666072010994 accuracy:0.75
 233 115
 234 loss:0.69532930851 accuracy:0.5
 235 116
 236 loss:0.67963975668 accuracy:0.5
 237 117
 238 loss:0.664266467094 accuracy:0.5
 239 118
 240 loss:0.637705147266 accuracy:1.0
 241 119
 242 loss:0.639262676239 accuracy:0.75
 243 120
 244 loss:0.549074888229 accuracy:0.75
 245 121
 246 loss:0.950685620308 accuracy:0.0
 247 122
 248 loss:0.649527430534 accuracy:0.75
 249 123
 250 loss:0.687687039375 accuracy:0.75
 251 124
 252 loss:0.236011490226 accuracy:1.0
 253 125
 254 loss:0.360839009285 accuracy:0.75
 255 126
 256 loss:0.160097658634 accuracy:1.0
 257 127
 258 loss:0.0284716840833 accuracy:1.0
 259 128
 260 loss:1.20477676392 accuracy:0.75
 261 129
 262 loss:0.785823404789 accuracy:0.25
 263 130
 264 loss:0.596278190613 accuracy:0.75
 265 131
 266 loss:0.474103331566 accuracy:0.75
 267 132
 268 loss:0.170581206679 accuracy:1.0
 269 133
 270 loss:1.17357873917 accuracy:0.5
 271 134
 272 loss:0.369093000889 accuracy:1.0
 273 135
 274 loss:0.396817922592 accuracy:0.75
 275 136
 276 loss:0.53536605835 accuracy:0.75
 277 137
 278 loss:0.276045441628 accuracy:0.75
 279 138
 280 loss:0.36287856102 accuracy:0.75
 281 139
 282 loss:0.196955054998 accuracy:1.0
 283 140
 284 loss:0.0153374457732 accuracy:1.0
 285 141
 286 loss:0.0416378080845 accuracy:1.0
 287 142
 288 loss:1.89024567604 accuracy:0.5
 289 143
 290 loss:0.512691378593 accuracy:1.0
 291 144
 292 loss:0.0846931710839 accuracy:1.0
 293 145
 294 loss:0.330246806145 accuracy:0.75
 295 146
 296 loss:0.349481493235 accuracy:0.75
 297 147
 298 loss:0.847968816757 accuracy:0.5
 299 148
 300 loss:0.320005506277 accuracy:0.75
 301 149
 302 loss:0.846890568733 accuracy:0.25
 303 150
 304 loss:0.236197531223 accuracy:1.0
 305 151
 306 loss:0.0872330516577 accuracy:1.0
 307 152
 308 loss:0.0432429946959 accuracy:1.0
 309 153
 310 loss:0.0282921157777 accuracy:1.0
 311 154
 312 loss:1.17421019077 accuracy:0.75
 313 155
 314 loss:0.526151418686 accuracy:0.75
 315 156
 316 loss:0.417270839214 accuracy:0.75
 317 157
 318 loss:0.537223100662 accuracy:0.75
 319 158
 320 loss:0.247993305326 accuracy:1.0
 321 159
 322 loss:0.278814792633 accuracy:1.0
 323 160
 324 loss:0.0463420078158 accuracy:1.0
 325 161
 326 loss:0.0170685201883 accuracy:1.0
 327 162
 328 loss:0.223224148154 accuracy:0.75
 329 163
 330 loss:0.0268691331148 accuracy:1.0
 331 164
 332 loss:2.04596710205 accuracy:0.5
 333 165
 334 loss:0.349981129169 accuracy:1.0
 335 166
 336 loss:0.812381505966 accuracy:0.5
 337 167
 338 loss:0.132523924112 accuracy:1.0
 339 168
 340 loss:0.493652850389 accuracy:0.75
 341 169
 342 loss:0.328869134188 accuracy:0.75
 343 170
 344 loss:0.105988666415 accuracy:1.0
 345 171
 346 loss:0.0751493424177 accuracy:1.0
 347 172
 348 loss:0.0750939249992 accuracy:1.0
 349 173
 350 loss:0.304137170315 accuracy:0.75
 351 174
 352 loss:0.273175984621 accuracy:0.75
 353 175
 354 loss:0.543058216572 accuracy:0.75
 355 176
 356 loss:1.90773689747 accuracy:0.5
 357 177
 358 loss:0.438852667809 accuracy:0.75
 359 178
 360 loss:0.442263126373 accuracy:0.75
 361 179
 362 loss:0.429260343313 accuracy:0.75
 363 180
 364 loss:0.245088890195 accuracy:1.0
 365 181
 366 loss:0.159963816404 accuracy:1.0
 367 182
 368 loss:0.039998114109 accuracy:1.0
 369 183
 370 loss:0.105835229158 accuracy:1.0
 371 184
 372 loss:0.00809328071773 accuracy:1.0
 373 185
 374 loss:0.0673048049212 accuracy:1.0
 375 186
 376 loss:1.13752818108 accuracy:0.75
 377 187
 378 loss:0.490282326937 accuracy:0.75
 379 188
 380 loss:1.42135226727 accuracy:0.75
 381 189
 382 loss:0.288748651743 accuracy:1.0
 383 190
 384 loss:0.0984246730804 accuracy:1.0
 385 191
 386 loss:0.123517766595 accuracy:1.0
 387 192
 388 loss:0.0920013636351 accuracy:1.0
 389 193
 390 loss:1.44978451729 accuracy:0.75
 391 194
 392 loss:0.305551946163 accuracy:1.0
 393 195
 394 loss:0.443002015352 accuracy:0.75
 395 196
 396 loss:0.106428675354 accuracy:1.0
 397 197
 398 loss:0.356863230467 accuracy:0.75
 399 198
 400 loss:0.0275120735168 accuracy:1.0
 401 199
 402 loss:1.12723910809 accuracy:0.75
 403 200
 404 loss:0.08886256814 accuracy:1.0
 405 201
 406 loss:0.0773176699877 accuracy:1.0
 407 202
 408 loss:0.17778685689 accuracy:1.0
 409 203
 410 loss:0.263333916664 accuracy:0.75
 411 204
 412 loss:0.100112996995 accuracy:1.0
 413 205
 414 loss:0.0208118930459 accuracy:1.0
 415 206
 416 loss:0.0241779796779 accuracy:1.0
 417 207
 418 loss:0.00176866375841 accuracy:1.0
 419 208
 420 loss:1.03581428528 accuracy:0.75
 421 209
 422 loss:0.101269900799 accuracy:1.0
 423 210
 424 loss:0.522728979588 accuracy:0.75
 425 211
 426 loss:0.0190876871347 accuracy:1.0
 427 212
 428 loss:0.851385474205 accuracy:0.75
 429 213
 430 loss:0.627064526081 accuracy:0.5
 431 214
 432 loss:0.178076297045 accuracy:1.0
 433 215
 434 loss:0.272920429707 accuracy:1.0
 435 216
 436 loss:0.722631931305 accuracy:0.75
 437 217
 438 loss:0.405046164989 accuracy:0.75
 439 218
 440 loss:0.434506893158 accuracy:0.75
 441 219
 442 loss:0.205615088344 accuracy:1.0
 443 220
 444 loss:0.102596238256 accuracy:1.0
 445 221
 446 loss:0.89775633812 accuracy:0.5
 447 222
 448 loss:0.0162407811731 accuracy:1.0
 449 223
 450 loss:0.257048845291 accuracy:0.75
 451 224
 452 loss:0.53179782629 accuracy:0.75
 453 225
 454 loss:0.414461612701 accuracy:0.75
 455 226
 456 loss:0.274204641581 accuracy:0.75
 457 227
 458 loss:0.751442372799 accuracy:0.75
 459 228
 460 loss:0.100349068642 accuracy:1.0
 461 229
 462 loss:0.491792619228 accuracy:0.5
 463 230
 464 loss:0.470929801464 accuracy:1.0
 465 231
 466 loss:0.684968233109 accuracy:0.5
 467 232
 468 loss:0.505018293858 accuracy:0.75
 469 233
 470 loss:0.23813906312 accuracy:1.0
 471 234
 472 loss:1.05322659016 accuracy:0.5
 473 235
 474 loss:0.291554331779 accuracy:1.0
 475 236
 476 loss:0.384746789932 accuracy:1.0
 477 237
 478 loss:0.37275955081 accuracy:0.75
 479 238
 480 loss:0.0688233971596 accuracy:1.0
 481 239
 482 loss:0.718187510967 accuracy:0.75
 483 240
 484 loss:0.609194219112 accuracy:0.75
 485 241
 486 loss:0.225485235453 accuracy:1.0
 487 242
 488 loss:0.283724486828 accuracy:0.75
 489 243
 490 loss:0.563280165195 accuracy:0.75
 491 244
 492 loss:0.0566305555403 accuracy:1.0
 493 245
 494 loss:0.0681798830628 accuracy:1.0
 495 246
 496 loss:0.198830872774 accuracy:1.0
 497 247
 498 loss:0.743586599827 accuracy:0.75
 499 248
 500 loss:0.108701385558 accuracy:1.0
 501 249
 502 loss:0.232169955969 accuracy:1.0
 503 250
 504 loss:0.0204469505697 accuracy:1.0
 505 251
 506 loss:0.0746807381511 accuracy:1.0
 507 252
 508 loss:1.67662298679 accuracy:0.5
 509 253
 510 loss:0.0344735346735 accuracy:1.0
 511 254
 512 loss:0.329333722591 accuracy:0.75
 513 255
 514 loss:0.0228136144578 accuracy:1.0
 515 256
 516 loss:0.558523058891 accuracy:0.75
 517 257
 518 loss:0.801098883152 accuracy:0.75
 519 258
 520 loss:0.294895410538 accuracy:1.0
 521 259
 522 loss:0.073697000742 accuracy:1.0
 523 260
 524 loss:0.0375180691481 accuracy:1.0
 525 261
 526 loss:0.246171832085 accuracy:0.75
 527 262
 528 loss:0.774982333183 accuracy:0.5
 529 263
 530 loss:0.305063486099 accuracy:1.0
 531 264
 532 loss:0.463157624006 accuracy:0.5
 533 265
 534 loss:0.642466902733 accuracy:0.25
 535 266
 536 loss:0.110810160637 accuracy:1.0
 537 267
 538 loss:0.055772036314 accuracy:1.0
 539 268
 540 loss:0.111803464592 accuracy:1.0
 541 269
 542 loss:0.0542620681226 accuracy:1.0
 543 270
 544 loss:0.867859005928 accuracy:0.75
 545 271
 546 loss:0.282488763332 accuracy:1.0
 547 272
 548 loss:0.102671615779 accuracy:1.0
 549 273
 550 loss:0.251693636179 accuracy:0.75
 551 274
 552 loss:0.765829801559 accuracy:0.75
 553 275
 554 loss:0.194914981723 accuracy:1.0
 555 276
 556 loss:0.102006778121 accuracy:1.0
 557 277
 558 loss:0.0539451315999 accuracy:1.0
 559 278
 560 loss:0.0130981495604 accuracy:1.0
 561 279
 562 loss:2.14160680771 accuracy:0.5
 563 280
 564 loss:0.176309481263 accuracy:1.0
 565 281
 566 loss:0.155295550823 accuracy:1.0
 567 282
 568 loss:0.0576198920608 accuracy:1.0
 569 283
 570 loss:0.267267256975 accuracy:1.0
 571 284
 572 loss:0.170527070761 accuracy:1.0
 573 285
 574 loss:0.793471336365 accuracy:0.75
 575 286
 576 loss:0.054802633822 accuracy:1.0
 577 287
 578 loss:0.0160926636308 accuracy:1.0
 579 288
 580 loss:0.113910079002 accuracy:1.0
 581 289
 582 loss:0.0136507945135 accuracy:1.0
 583 290
 584 loss:0.319148600101 accuracy:0.75
 585 291
 586 loss:0.000944297935348 accuracy:1.0
 587 292
 588 loss:0.000640460464638 accuracy:1.0
 589 293
 590 loss:0.00669733900577 accuracy:1.0
 591 294
 592 loss:0.00175015779678 accuracy:1.0
 593 295
 594 loss:0.0475143417716 accuracy:1.0
 595 296
 596 loss:0.00636913161725 accuracy:1.0
 597 297
 598 loss:0.00344254914671 accuracy:1.0
 599 298
 600 loss:0.629906773567 accuracy:0.75
 601 299
 602 loss:0.00485158292577 accuracy:1.0
 603 300
 604 loss:0.117860376835 accuracy:1.0
 605 301
 606 loss:2.2443985939 accuracy:0.75
 607 302
 608 loss:0.00151524401736 accuracy:1.0
 609 303
 610 loss:0.668887317181 accuracy:0.75
 611 304
 612 loss:0.341220498085 accuracy:0.75
 613 305
 614 loss:0.243527442217 accuracy:0.75
 615 306
 616 loss:0.109274975955 accuracy:1.0
 617 307
 618 loss:0.127818629146 accuracy:1.0
 619 308
 620 loss:0.0721819028258 accuracy:1.0
 621 309
 622 loss:0.0184937343001 accuracy:1.0
 623 310
 624 loss:0.820344865322 accuracy:0.5
 625 311
 626 loss:0.0684595555067 accuracy:1.0
 627 312
 628 loss:0.364878892899 accuracy:0.75
 629 313
 630 loss:0.119165182114 accuracy:1.0
 631 314
 632 loss:0.917512893677 accuracy:0.5
 633 315
 634 loss:0.208229511976 accuracy:0.75
 635 316
 636 loss:0.0379144325852 accuracy:1.0
 637 317
 638 loss:0.291262000799 accuracy:0.75
 639 318
 640 loss:1.70546030998 accuracy:0.5
 641 319
 642 loss:0.0183182619512 accuracy:1.0
 643 320
 644 loss:0.382932752371 accuracy:0.75
 645 321
 646 loss:0.163620784879 accuracy:1.0
 647 322
 648 loss:0.319008469582 accuracy:0.75
 649 323
 650 loss:0.088489279151 accuracy:1.0
 651 324
 652 loss:0.715149879456 accuracy:0.5
 653 325
 654 loss:0.0675266161561 accuracy:1.0
 655 326
 656 loss:0.916550815105 accuracy:0.5
 657 327
 658 loss:0.448634713888 accuracy:1.0
 659 328
 660 loss:0.271819204092 accuracy:0.75
 661 329
 662 loss:0.0831155627966 accuracy:1.0
 663 330
 664 loss:0.171018838882 accuracy:1.0
 665 331
 666 loss:0.0210947152227 accuracy:1.0
 667 332
 668 loss:0.331143260002 accuracy:0.75
 669 333
 670 loss:0.50136744976 accuracy:0.75
 671 334
 672 loss:0.156625300646 accuracy:1.0
 673 335
 674 loss:0.0159201174974 accuracy:1.0
 675 336
 676 loss:0.171763345599 accuracy:1.0
 677 337
 678 loss:0.317091315985 accuracy:0.75
 679 338
 680 loss:0.00742457062006 accuracy:1.0
 681 339
 682 loss:0.147552683949 accuracy:1.0
 683 340
 684 loss:0.265565574169 accuracy:0.75
 685 341
 686 loss:0.0794127807021 accuracy:1.0
 687 342
 688 loss:0.90516358614 accuracy:0.75
 689 343
 690 loss:0.0485695488751 accuracy:1.0
 691 344
 692 loss:0.929676651955 accuracy:0.75
 693 345
 694 loss:0.0915883779526 accuracy:1.0
 695 346
 696 loss:0.0149378413334 accuracy:1.0
 697 347
 698 loss:0.0227350518107 accuracy:1.0
 699 348
 700 loss:0.188080132008 accuracy:1.0
 701 349
 702 loss:0.0991646498442 accuracy:1.0
 703 350
 704 loss:0.0718017593026 accuracy:1.0
 705 351
 706 loss:1.19274258614 accuracy:0.5
 707 352
 708 loss:0.965473353863 accuracy:0.5
 709 353
 710 loss:0.259137153625 accuracy:0.75
 711 354
 712 loss:0.0660394281149 accuracy:1.0
 713 355
 714 loss:0.0636159256101 accuracy:1.0
 715 356
 716 loss:0.473960787058 accuracy:0.75
 717 357
 718 loss:0.0584978982806 accuracy:1.0
 719 358
 720 loss:0.225148662925 accuracy:1.0
 721 359
 722 loss:0.551927268505 accuracy:0.75
 723 360
 724 loss:0.129055544734 accuracy:1.0
 725 361
 726 loss:0.135725021362 accuracy:1.0
 727 362
 728 loss:0.05837514624 accuracy:1.0
 729 363
 730 loss:0.050028629601 accuracy:1.0
 731 364
 732 loss:0.0220219194889 accuracy:1.0
 733 365
 734 loss:0.563142418861 accuracy:0.75
 735 366
 736 loss:0.213800609112 accuracy:1.0
 737 367
 738 loss:0.0281376540661 accuracy:1.0
 739 368
 740 loss:1.20224881172 accuracy:0.75
 741 369
 742 loss:0.528139770031 accuracy:0.75
 743 370
 744 loss:0.124928534031 accuracy:1.0
 745 371
 746 loss:0.26053994894 accuracy:0.75
 747 372
 748 loss:0.200136646628 accuracy:0.75
 749 373
 750 loss:0.106237880886 accuracy:1.0
 751 374
 752 loss:0.317531168461 accuracy:1.0
 753 375
 754 loss:0.246357157826 accuracy:0.75
 755 376
 756 loss:0.161189392209 accuracy:1.0
 757 377
 758 loss:0.0400363244116 accuracy:1.0
 759 378
 760 loss:0.000115944123536 accuracy:1.0
 761 379
 762 loss:0.0736970975995 accuracy:1.0
 763 380
 764 loss:2.95828056335 accuracy:0.5
 765 381
 766 loss:0.0402479618788 accuracy:1.0
 767 382
 768 loss:0.27467161417 accuracy:1.0
 769 383
 770 loss:0.0441851988435 accuracy:1.0
 771 384
 772 loss:0.0222014114261 accuracy:1.0
 773 385
 774 loss:0.0845765322447 accuracy:1.0
 775 386
 776 loss:0.21609556675 accuracy:0.75
 777 387
 778 loss:0.305368185043 accuracy:0.75
 779 388
 780 loss:0.457645982504 accuracy:0.75
 781 389
 782 loss:0.479472994804 accuracy:0.75
 783 390
 784 loss:0.163302078843 accuracy:1.0
 785 391
 786 loss:0.436002552509 accuracy:0.75
 787 392
 788 loss:0.128151774406 accuracy:1.0
 789 393
 790 loss:0.258456408978 accuracy:0.75
 791 394
 792 loss:0.22227601707 accuracy:0.75
 793 395
 794 loss:0.0503372251987 accuracy:1.0
 795 396
 796 loss:0.02476574108 accuracy:1.0
 797 397
 798 loss:0.000495057029184 accuracy:1.0
 799 398
 800 loss:0.419431209564 accuracy:0.75
 801 399
 802 loss:0.279945731163 accuracy:1.0
 803 400
 804 loss:0.000864843954332 accuracy:1.0
 805 401
 806 loss:0.0879789367318 accuracy:1.0
 807 402
 808 loss:0.00543585978448 accuracy:1.0
 809 403
 810 loss:0.0035734588746 accuracy:1.0
 811 404
 812 loss:0.00278418860398 accuracy:1.0
 813 405
 814 loss:0.800966143608 accuracy:0.75
 815 406
 816 loss:0.0348575152457 accuracy:1.0
 817 407
 818 loss:0.217690259218 accuracy:0.75
 819 408
 820 loss:0.00130753079429 accuracy:1.0
 821 409
 822 loss:0.00162001827266 accuracy:1.0
 823 410
 824 loss:0.546540558338 accuracy:0.75
 825 411
 826 loss:0.443211138248 accuracy:0.75
 827 412
 828 loss:0.0923056006432 accuracy:1.0
 829 413
 830 loss:0.282079219818 accuracy:0.75
 831 414
 832 loss:0.304762452841 accuracy:0.75
 833 415
 834 loss:0.292380183935 accuracy:0.75
 835 416
 836 loss:0.028173699975 accuracy:1.0
 837 417
 838 loss:0.0553055480123 accuracy:1.0
 839 418
 840 loss:0.388806015253 accuracy:0.75
 841 419
 842 loss:0.256281733513 accuracy:0.75
 843 420
 844 loss:0.00459419749677 accuracy:1.0
 845 421
 846 loss:0.108316868544 accuracy:1.0
 847 422
 848 loss:0.00306999869645 accuracy:1.0
 849 423
 850 loss:0.185824766755 accuracy:0.75
 851 424
 852 loss:0.0356827452779 accuracy:1.0
 853 425
 854 loss:0.0110305007547 accuracy:1.0
 855 426
 856 loss:0.000118359719636 accuracy:1.0
 857 427
 858 loss:0.0264259390533 accuracy:1.0
 859 428
 860 loss:2.09415435791 accuracy:0.5
 861 429
 862 loss:0.405786812305 accuracy:0.5
 863 430
 864 loss:0.170478060842 accuracy:1.0
 865 431
 866 loss:0.153327018023 accuracy:1.0
 867 432
 868 loss:0.0670616924763 accuracy:1.0
 869 433
 870 loss:0.100017897785 accuracy:1.0
 871 434
 872 loss:0.803987801075 accuracy:0.75
 873 435
 874 loss:0.242291912436 accuracy:0.75
 875 436
 876 loss:0.887839794159 accuracy:0.75
 877 437
 878 loss:0.126330152154 accuracy:1.0
 879 438
 880 loss:0.495402723551 accuracy:0.5
 881 439
 882 loss:0.0176431145519 accuracy:1.0
 883 440
 884 loss:0.254504919052 accuracy:1.0
 885 441
 886 loss:0.0066742207855 accuracy:1.0
 887 442
 888 loss:0.103796347976 accuracy:1.0
 889 443
 890 loss:0.0256795622408 accuracy:1.0
 891 444
 892 loss:0.412333756685 accuracy:0.75
 893 445
 894 loss:0.0198563206941 accuracy:1.0
 895 446
 896 loss:0.0271796099842 accuracy:1.0
 897 447
 898 loss:0.00262259342708 accuracy:1.0
 899 448
 900 loss:0.679375708103 accuracy:0.75
 901 449
 902 loss:0.436676889658 accuracy:0.75
 903 450
 904 loss:0.133831515908 accuracy:1.0
 905 451
 906 loss:0.121498912573 accuracy:1.0
 907 452
 908 loss:0.033711925149 accuracy:1.0
 909 453
 910 loss:0.102268278599 accuracy:1.0
 911 454
 912 loss:0.00103223056067 accuracy:1.0
 913 455
 914 loss:0.128242060542 accuracy:1.0
 915 456
 916 loss:0.00504214642569 accuracy:1.0
 917 457
 918 loss:0.00237890915014 accuracy:1.0
 919 458
 920 loss:1.08625376225 accuracy:0.25
 921 459
 922 loss:0.030952764675 accuracy:1.0
 923 460
 924 loss:0.173320218921 accuracy:1.0
 925 461
 926 loss:0.121969670057 accuracy:1.0
 927 462
 928 loss:0.0947612226009 accuracy:1.0
 929 463
 930 loss:0.205078348517 accuracy:0.75
 931 464
 932 loss:0.00106444279663 accuracy:1.0
 933 465
 934 loss:0.34515401721 accuracy:0.75
 935 466
 936 loss:0.15998339653 accuracy:1.0
 937 467
 938 loss:0.00492420978844 accuracy:1.0
 939 468
 940 loss:0.0870720297098 accuracy:1.0
 941 469
 942 loss:2.09969067574 accuracy:0.5
 943 470
 944 loss:0.194903433323 accuracy:1.0
 945 471
 946 loss:0.242374703288 accuracy:1.0
 947 472
 948 loss:0.00174707639962 accuracy:1.0
 949 473
 950 loss:0.0663149431348 accuracy:1.0
 951 474
 952 loss:0.0415232479572 accuracy:1.0
 953 475
 954 loss:0.745410084724 accuracy:0.75
 955 476
 956 loss:0.72058993578 accuracy:0.75
 957 477
 958 loss:0.074091270566 accuracy:1.0
 959 478
 960 loss:0.0825443267822 accuracy:1.0
 961 479
 962 loss:0.0513244643807 accuracy:1.0
 963 480
 964 loss:0.0320774801075 accuracy:1.0
 965 481
 966 loss:0.0128127280623 accuracy:1.0
 967 482
 968 loss:0.0371737554669 accuracy:1.0
 969 483
 970 loss:0.276018559933 accuracy:0.75
 971 484
 972 loss:0.0172993671149 accuracy:1.0
 973 485
 974 loss:0.0301472023129 accuracy:1.0
 975 486
 976 loss:0.00649361917749 accuracy:1.0
 977 487
 978 loss:0.000473263178719 accuracy:1.0
 979 488
 980 loss:0.000434344052337 accuracy:1.0
 981 489
 982 loss:0.0177765209228 accuracy:1.0
 983 490
 984 loss:0.100023776293 accuracy:1.0
 985 491
 986 loss:0.00998072884977 accuracy:1.0
 987 492
 988 loss:0.178784310818 accuracy:0.75
 989 493
 990 loss:0.000287099683192 accuracy:1.0
 991 494
 992 loss:2.17384004593 accuracy:0.75
 993 495
 994 loss:0.125859886408 accuracy:1.0
 995 496
 996 loss:0.0469430424273 accuracy:1.0
 997 497
 998 loss:0.0470446236432 accuracy:1.0
 999 498
1000 loss:0.00149866973516 accuracy:1.0
1001 499
1002 loss:1.76050198078 accuracy:0.5
1003 500
1004 loss:0.223427206278 accuracy:1.0
1005 501
1006 loss:0.252842336893 accuracy:0.75
1007 502
1008 loss:0.688393950462 accuracy:0.75
1009 503
1010 loss:0.0202198959887 accuracy:1.0
1011 504
1012 loss:0.00671406136826 accuracy:1.0
1013 505
1014 loss:0.248940289021 accuracy:1.0
1015 506
1016 loss:0.274929821491 accuracy:0.75
1017 507
1018 loss:0.12192375958 accuracy:1.0
1019 508
1020 loss:0.529097795486 accuracy:0.75
1021 509
1022 loss:0.0117030935362 accuracy:1.0
1023 510
1024 loss:0.0703663975 accuracy:1.0
1025 511
1026 loss:0.00478047179058 accuracy:1.0
1027 512
1028 loss:0.0121546797454 accuracy:1.0
1029 513
1030 loss:0.208536297083 accuracy:1.0
1031 514
1032 loss:0.00334931351244 accuracy:1.0
1033 515
1034 loss:0.79892295599 accuracy:0.75
1035 516
1036 loss:1.14639115334 accuracy:0.75
1037 517
1038 loss:0.0293184090406 accuracy:1.0
1039 518
1040 loss:0.0145129384473 accuracy:1.0
1041 519
1042 loss:0.51245445013 accuracy:0.5
1043 520
1044 loss:0.163923382759 accuracy:1.0
1045 521
1046 loss:0.00152231776156 accuracy:1.0
1047 522
1048 loss:0.00467296224087 accuracy:1.0
1049 523
1050 loss:0.335566133261 accuracy:0.75
1051 524
1052 loss:0.565649867058 accuracy:0.75
1053 525
1054 loss:0.0779503583908 accuracy:1.0
1055 526
1056 loss:0.0503666475415 accuracy:1.0
1057 527
1058 loss:0.0936669185758 accuracy:1.0
1059 528
1060 loss:0.0114694610238 accuracy:1.0
1061 529
1062 loss:0.0113796535879 accuracy:1.0
1063 530
1064 loss:0.00210900465026 accuracy:1.0
1065 531
1066 loss:0.0697501897812 accuracy:1.0
1067 532
1068 loss:0.0413017123938 accuracy:1.0
1069 533
1070 loss:0.000223232258577 accuracy:1.0
1071 534
1072 loss:0.00237680179998 accuracy:1.0
1073 535
1074 loss:0.0935806557536 accuracy:1.0
1075 536
1076 loss:0.105601318181 accuracy:1.0
1077 537
1078 loss:2.22019316425e-05 accuracy:1.0
1079 538
1080 loss:0.604238510132 accuracy:0.75
1081 539
1082 loss:0.0422407202423 accuracy:1.0
1083 540
1084 loss:0.0232363473624 accuracy:1.0
1085 541
1086 loss:0.0315810516477 accuracy:1.0
1087 542
1088 loss:3.51061898982e-05 accuracy:1.0
1089 543
1090 loss:0.0173356998712 accuracy:1.0
1091 544
1092 loss:0.00834203884006 accuracy:1.0
1093 545
1094 loss:0.000342688814271 accuracy:1.0
1095 546
1096 loss:7.11309767212e-05 accuracy:1.0
1097 547
1098 loss:0.00906061194837 accuracy:1.0
1099 548
1100 loss:1.66892471043e-06 accuracy:1.0
1101 549
1102 loss:0.00172243604902 accuracy:1.0
1103 550
1104 loss:0.034824796021 accuracy:1.0
1105 551
1106 loss:1.22189294416e-06 accuracy:1.0
1107 552
1108 loss:0.00228166719899 accuracy:1.0
1109 553
1110 loss:1.75538408756 accuracy:0.75
1111 554
1112 loss:0.160510271788 accuracy:1.0
1113 555
1114 loss:0.00583411566913 accuracy:1.0
1115 556
1116 loss:0.0328364670277 accuracy:1.0
1117 557
1118 loss:0.865779876709 accuracy:0.75
1119 558
1120 loss:0.643167614937 accuracy:0.5
1121 559
1122 loss:2.28500294685 accuracy:0.0
1123 560
1124 loss:0.0093042999506 accuracy:1.0
1125 561
1126 loss:0.735183119774 accuracy:0.75
1127 562
1128 loss:0.0769147053361 accuracy:1.0
1129 563
1130 loss:0.0310892332345 accuracy:1.0
1131 564
1132 loss:0.0728826448321 accuracy:1.0
1133 565
1134 loss:0.178516685963 accuracy:1.0
1135 566
1136 loss:0.0103313624859 accuracy:1.0
1137 567
1138 loss:0.118710055947 accuracy:1.0
1139 568
1140 loss:0.074576176703 accuracy:1.0
1141 569
1142 loss:0.240194231272 accuracy:0.75
1143 570
1144 loss:0.0038958825171 accuracy:1.0
1145 571
1146 loss:0.000401474506361 accuracy:1.0
1147 572
loss:0.0813326686621 accuracy:1.0
573
loss:0.0319667756557 accuracy:1.0
574
loss:0.0254385173321 accuracy:1.0
575
loss:0.00608881236985 accuracy:1.0
... (steps 576 through 996 omitted: per-batch loss keeps bouncing between 0 and about 5.1, while per-batch accuracy stays at 1.0 on most steps, occasionally dropping to 0.75 or 0.5)
997
loss:0.000725160876755 accuracy:1.0
998
loss:1.57951808433e-06 accuracy:1.0
999
loss:0.00928597245365 accuracy:1.0
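
For reference, here is a minimal sketch (not the author's exact script; the model and data below are placeholders) of the kind of TF 1.x loop that produces a log like the one above. A mini-batch size of 4 would explain why the logged accuracy only takes values such as 0.5, 0.75 and 1.0:

# Hypothetical sketch: tiny 5-class model trained on random 4-sample batches,
# printing per-step metrics in the same "loss:... accuracy:..." format.
import numpy as np
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 10])
y = tf.placeholder(tf.int64, [None])
logits = tf.layers.dense(x, 5)                       # 5 classes, as in the notes below
loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=logits))
accuracy = tf.reduce_mean(tf.cast(tf.equal(tf.argmax(logits, 1), y), tf.float32))
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(1000):
        bx = np.random.rand(4, 10).astype(np.float32)   # dummy 4-sample batch
        by = np.random.randint(0, 5, size=4)
        _, l, acc = sess.run([train_op, loss, accuracy], {x: bx, y: by})
        print(step)
        print('loss:{} accuracy:{}'.format(l, acc))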

 

as_list
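
The bare "as_list" note presumably refers to TensorShape.as_list(), which returns a tensor's static shape as a plain Python list (with None for unknown dimensions); a quick TF 1.x check:

import tensorflow as tf

img = tf.placeholder(tf.float32, [None, 299, 299, 3])  # Inception v4 input shape
print(img.get_shape().as_list())  # prints [None, 299, 299, 3]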

----------2018.02.19 Continuing training on top of a given Inception v4 model---------------

Following the official manual and the Inception implementation in the slim framework.

Experiment log:

error 1:

InvalidArgumentError (see above for traceback): Assign requires shapes of both tensors to match. lhs shape= [5] rhs shape= [1001]

The official checkpoint is trained on 1000-class ImageNet (1001 outputs including the background class), so the classification layer ([5] on the left-hand side is my own 5-class output) has to be retrained on my own data.
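
A sketch of the corresponding fine-tuning command, following the slim README (all paths and the dataset name mydata are placeholders, and a custom dataset must first be registered in slim's dataset_factory.py). The idea is to restore every variable except the two logits scopes, then train only those scopes for the new 5 classes:

rem hypothetical paths; ^ continues a line in cmd.exe
python train_image_classifier.py ^
  --train_dir=D:\tmp\train_logs ^
  --dataset_name=mydata ^
  --dataset_dir=D:\data\mydata ^
  --model_name=inception_v4 ^
  --checkpoint_path=D:\checkpoints\inception_v4.ckpt ^
  --checkpoint_exclude_scopes=InceptionV4/Logits,InceptionV4/AuxLogits ^
  --trainable_scopes=InceptionV4/Logits,InceptionV4/AuxLogits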

error 2:

InvalidArgumentError (see above for traceback): Assign requires shapes of both tensors to match. lhs shape= [5] rhs shape= [1001]

Training runs on the CPU, so in train_image_classifier.py change the default of tf.app.flags.DEFINE_boolean('clone_on_cpu', False, 'Use CPUs to deploy clones.') from False to True.
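
Editing the source is not strictly necessary: a tf.app.flags default can be overridden on the command line, so the following is equivalent:

python train_image_classifier.py --clone_on_cpu=True [other flags as in the sketch above]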

Visualization:

After launching tensorboard it prints "TensorBoard 1.5.1 at http://localhost:6006 (Press CTRL+C to quit)"; just open http://localhost:6006 in a browser.
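
For reference, TensorBoard is launched by pointing --logdir at the training directory (path hypothetical, matching the sketch above):

tensorboard --logdir=D:\tmp\train_logs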

 

Reposted from: https://www.cnblogs.com/zhengmeisong/p/7751405.html
