Neural Network Example
Published: 2019-06-08


 

For the complete machine learning algorithm series, see:

This post classifies the Pima Indians Diabetes dataset: 768 samples, each with 8 numeric medical features and a 0/1 label in the last column indicating whether the patient developed diabetes.

Code

from keras.models import Sequential
from keras.layers import Dense
import numpy
# fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)
Using TensorFlow backend.

Data

# load pima indians dataset
dataset = numpy.loadtxt("../DATA/pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
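Note that this post trains and evaluates on the same 768 rows, so the accuracy reported later is measured on data the model has already seen. A minimal sketch of a held-out split, assuming scikit-learn is installed:

from sklearn.model_selection import train_test_split

# hold out 20% of the rows for an unbiased evaluation
X_train, X_test, Y_train, Y_test = train_test_split(
    X, Y, test_size=0.2, random_state=seed)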

Model

# create model (using the Keras 2 kernel_initializer argument; the older
# init= spelling is deprecated and triggers UserWarnings)
model = Sequential()
model.add(Dense(12, input_dim=8, kernel_initializer='uniform', activation='relu'))
model.add(Dense(8, kernel_initializer='uniform', activation='relu'))
model.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))
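The network is a plain fully connected stack: 8 inputs feed a 12-unit ReLU layer, then an 8-unit ReLU layer, then a single sigmoid unit that outputs the probability of diabetes. Keras can print the layer shapes and parameter counts (for example, the first layer holds 8 x 12 + 12 = 108 parameters):

# inspect layer output shapes and parameter counts
model.summary()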

Compile the model

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
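binary_crossentropy is the standard loss for a sigmoid output: it is the average negative log-likelihood of the true 0/1 labels under the predicted probabilities. A small numpy sketch of the same quantity, for illustration only (Keras computes this internally):

import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # clip predictions away from 0 and 1 so log() stays finite
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))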

Train the model

# Fit the model (epochs= replaces the deprecated nb_epoch= argument)
model.fit(X, Y, epochs=100, batch_size=10)
Epoch 1/100
768/768 [==============================] - 0s 108us/step - loss: 0.4920 - acc: 0.7734
Epoch 2/100
768/768 [==============================] - 0s 102us/step - loss: 0.4862 - acc: 0.7773
...
Epoch 99/100
768/768 [==============================] - 0s 92us/step - loss: 0.4449 - acc: 0.7904
Epoch 100/100
768/768 [==============================] - 0s 93us/step - loss: 0.4421 - acc: 0.7995
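fit() also returns a History object whose .history dict stores the per-epoch loss and accuracy, which is easier to work with than the console log above. A minimal sketch, assuming matplotlib is installed (note that calling fit again continues training the already-fitted weights):

import matplotlib.pyplot as plt

# capture per-epoch metrics; verbose=0 silences the console log
history = model.fit(X, Y, epochs=100, batch_size=10, verbose=0)
plt.plot(history.history['loss'], label='loss')
plt.plot(history.history['acc'], label='acc')   # key is 'accuracy' in newer Keras
plt.xlabel('epoch')
plt.legend()
plt.show()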

Evaluate the model

# evaluate the model
scores = model.evaluate(X, Y)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))
768/768 [==============================] - 0s 20us/step
acc: 80.99%

Predict

# calculate predictions
predictions = model.predict(X)
# round predictions
rounded = [round(x) for x in predictions.ravel()]
print(rounded)
[1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, ..., 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0]
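As a sanity check on the evaluate() figure, the thresholded predictions can be compared against Y directly with numpy:

import numpy as np

# threshold the sigmoid outputs at 0.5 and compare with the true labels
pred = (model.predict(X).ravel() > 0.5).astype(int)
print("manual accuracy: %.2f%%" % (np.mean(pred == Y) * 100))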
 

Reposted from: https://www.cnblogs.com/htfeng/p/9931748.html
