
(17) Running the ForwardPropagation.py code

段智华 · Published 2019-04-01 06:20:21

The ForwardPropagation.py code in Create_AI_Framework_In5Classes(Day2):

```python
# -*- coding: utf-8 -*-
# Compute the network's values from the Input Layer, through the
# Hidden Layers, to the Output Layer.
import math

class ForwardPropagation:

    def applyForwardPropagation(nodes, weights, instance):
        # Bias units always output 1. (The original listing had the bug
        # `nodes[i].set_value = 1`, which overwrites the method instead of
        # calling it; fixed here.)
        for i in range(len(nodes)):
            if nodes[i].get_is_bias_unit() == True:
                nodes[i].set_value(1)

        # Feed the record into the Input Layer,
        # e.g. for instance = [0, 1, 1]
        for j in range(len(instance) - 1):  # training uses only the features
            value_of_feature = instance[j]  # value of each feature in the record
            for k in range(len(nodes)):
                # the node at index 0 is the bias, so start from index 1
                if j + 1 == nodes[k].get_index():
                    nodes[k].set_value(value_of_feature)

        # Hidden Layer (and Output Layer) processing
        for j in range(len(nodes)):
            if nodes[j].get_is_bias_unit() == False and nodes[j].get_level() > 0:
                # sum of the products of all related neurons and weights
                # from the previous layer
                target_neuron_input = 0
                # output after the non-linearity; we use Sigmoid here
                target_neuron_output = 0
                # ID of the current neuron
                target_index = nodes[j].get_index()

                for k in range(len(weights)):
                    # weight connected to the current neuron
                    if target_index == weights[k].get_to_index():
                        weight_value = weights[k].get_value()     # value of that weight
                        from_index = weights[k].get_from_index()  # ID of its source neuron

                        # find the source neuron and read its value
                        for m in range(len(nodes)):
                            if from_index == nodes[m].get_index():
                                value_from_neuron = nodes[m].get_value()
                                # multiply weight by value and accumulate
                                target_neuron_input = target_neuron_input + (weight_value * value_from_neuron)
                                # one weight connects exactly one neuron in the
                                # previous layer, so stop scanning the rest
                                break

                # apply Sigmoid to the accumulated input of the current neuron
                target_neuron_output = 1 / (1 + math.exp(-target_neuron_input))

                # store the raw input and the activated value in the current neuron
                nodes[j].set_input_value(target_neuron_input)
                nodes[j].set_value(target_neuron_output)
```
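The listing above depends on `Node` and `Weight` classes that are defined elsewhere in the framework. As a minimal, self-contained sketch (the class bodies here are assumptions inferred from the getters used in the listing, and the 3-node toy network is hypothetical), a single forward step through one sigmoid neuron looks like this:

```python
import math

class Node:
    def __init__(self, index, level, is_bias_unit=False):
        self.index = index            # global ID of the node
        self.level = level            # layer number (0 = input layer)
        self.is_bias_unit = is_bias_unit
        self.value = 0.0              # activated output
        self.input_value = 0.0        # raw weighted-sum input

    def get_index(self): return self.index
    def get_level(self): return self.level
    def get_is_bias_unit(self): return self.is_bias_unit
    def get_value(self): return self.value
    def set_value(self, v): self.value = v
    def set_input_value(self, v): self.input_value = v

class Weight:
    def __init__(self, from_index, to_index, value):
        self.from_index = from_index
        self.to_index = to_index
        self.value = value

    def get_from_index(self): return self.from_index
    def get_to_index(self): return self.to_index
    def get_value(self): return self.value

# Toy network: bias (index 0) and one input (index 1) feeding one output (index 2)
nodes = [Node(0, 0, is_bias_unit=True), Node(1, 0), Node(2, 1)]
weights = [Weight(0, 2, 0.5), Weight(1, 2, -0.25)]

# Forward pass, mirroring the listing: bias = 1, then weighted sum, then Sigmoid
nodes[0].set_value(1)
nodes[1].set_value(1.0)  # feature value of the instance
z = sum(w.get_value() * nodes[w.get_from_index()].get_value() for w in weights)
nodes[2].set_input_value(z)
nodes[2].set_value(1 / (1 + math.exp(-z)))

print(round(nodes[2].get_value(), 4))  # sigmoid(0.25) ≈ 0.5622
```

Indexing `nodes` by `w.get_from_index()` works here only because each node's ID equals its list position; the framework's own code searches by ID instead, which is why it loops over all nodes.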

 

(3) Test results.

Test the result in the entry program Neuron_Network_Entry.py. The output is the value of the last node: to obtain the result of this Forward Propagation run, locate the last node among all the nodes and print its predicted value.
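Reading the prediction off the last node can be sketched as follows (a hypothetical fragment, not the book's actual entry-program code; the `Node` getter mirrors the one used in the listing above):

```python
# After forward propagation, the output-layer value sits in the last
# element of the nodes list.
class Node:
    def __init__(self, index, value):
        self.index = index
        self.value = value

    def get_value(self):
        return self.value

nodes = [Node(0, 1.0), Node(1, 0.73), Node(2, 0.4477)]  # toy post-forward values
prediction = nodes[len(nodes) - 1].get_value()          # last node = output node
print("Prediction:", prediction)
```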

Running Neuron_Network_Entry.py produces the following output:

```
+1      V1      V2
Hidden layer creation: 1        N[1][1] N[1][2] N[1][3] N[1][4] N[1][5] N[1][6] N[1][7] N[1][8]
Hidden layer creation: 2        N[2][1] N[2][2] N[2][3] N[2][4]
Hidden layer creation: 3        N[3][1] N[3][2]
Output layer:  Output
The weight from 1 at layers[0] to 4 at layers[1] : -0.01719299063527102
The weight from 1 at layers[0] to 5 at layers[1] : -0.8524173292229386
The weight from 1 at layers[0] to 6 at layers[1] : 0.22699060934105253
The weight from 1 at layers[0] to 7 at layers[1] : -0.18342643007293868
The weight from 1 at layers[0] to 8 at layers[1] : 0.535174965756674
The weight from 1 at layers[0] to 9 at layers[1] : -0.14676978791733708
The weight from 1 at layers[0] to 10 at layers[1] : 0.3575707340850214
The weight from 1 at layers[0] to 11 at layers[1] : -0.42137618665671717
The weight from 2 at layers[0] to 4 at layers[1] : -0.8009065486052088
The weight from 2 at layers[0] to 5 at layers[1] : 0.058917238487063095
The weight from 2 at layers[0] to 6 at layers[1] : -0.42346508944034544
The weight from 2 at layers[0] to 7 at layers[1] : 0.8426870154158392
The weight from 2 at layers[0] to 8 at layers[1] : 0.32010217521550643
The weight from 2 at layers[0] to 9 at layers[1] : -0.18659699268657703
The weight from 2 at layers[0] to 10 at layers[1] : -0.21967241566753914
The weight from 2 at layers[0] to 11 at layers[1] : -0.24400451550197744
The weight from 4 at layers[1] to 13 at layers[2] : 1.012406950277446
The weight from 4 at layers[1] to 14 at layers[2] : -0.7119667051463217
The weight from 4 at layers[1] to 15 at layers[2] : 0.6123794505814086
The weight from 4 at layers[1] to 16 at layers[2] : 0.20933909060981204
The weight from 5 at layers[1] to 13 at layers[2] : 0.8295825393038667
The weight from 5 at layers[1] to 14 at layers[2] : -0.18589793075961192
The weight from 5 at layers[1] to 15 at layers[2] : -0.4965519410696049
The weight from 5 at layers[1] to 16 at layers[2] : 0.8986794993436826
The weight from 6 at layers[1] to 13 at layers[2] : -0.5419190030559935
The weight from 6 at layers[1] to 14 at layers[2] : -0.030482481689729557
The weight from 6 at layers[1] to 15 at layers[2] : -0.16049573458078903
The weight from 6 at layers[1] to 16 at layers[2] : -0.2974003908293369
The weight from 7 at layers[1] to 13 at layers[2] : 0.2390039732386664
The weight from 7 at layers[1] to 14 at layers[2] : 0.5368392670597157
The weight from 7 at layers[1] to 15 at layers[2] : -0.38067640003252334
The weight from 7 at layers[1] to 16 at layers[2] : 0.08850351612527696
The weight from 8 at layers[1] to 13 at layers[2] : -0.45517979672262143
The weight from 8 at layers[1] to 14 at layers[2] : -0.48321818662131666
The weight from 8 at layers[1] to 15 at layers[2] : 0.5723650651069483
The weight from 8 at layers[1] to 16 at layers[2] : 0.20266673558402148
The weight from 9 at layers[1] to 13 at layers[2] : 0.4648137370852401
The weight from 9 at layers[1] to 14 at layers[2] : -0.9281727938087562
The weight from 9 at layers[1] to 15 at layers[2] : -0.3137211374881055
The weight from 9 at layers[1] to 16 at layers[2] : -0.9786522293599609
The weight from 10 at layers[1] to 13 at layers[2] : 0.7513533393983693
The weight from 10 at layers[1] to 14 at layers[2] : 0.6710274413316895
The weight from 10 at layers[1] to 15 at layers[2] : 0.9971668767938549
The weight from 10 at layers[1] to 16 at layers[2] : 0.34557762622192767
The weight from 11 at layers[1] to 13 at layers[2] : -0.8777163395171548
The weight from 11 at layers[1] to 14 at layers[2] : -0.05999121610430269
The weight from 11 at layers[1] to 15 at layers[2] : -0.6652778238381963
The weight from 11 at layers[1] to 16 at layers[2] : 0.08832878707869818
The weight from 13 at layers[2] to 18 at layers[3] : -0.8324236408834976
The weight from 13 at layers[2] to 19 at layers[3] : -0.7635884380604687
The weight from 14 at layers[2] to 18 at layers[3] : 0.8092754640273179
The weight from 14 at layers[2] to 19 at layers[3] : -0.09275930457566717
The weight from 15 at layers[2] to 18 at layers[3] : -0.6719236795958695
The weight from 15 at layers[2] to 19 at layers[3] : -0.8016890418335867
The weight from 16 at layers[2] to 18 at layers[3] : 1.0029644245873488
The weight from 16 at layers[2] to 19 at layers[3] : -0.8780918045481161
The weight from 18 at layers[3] to 20 at layers[4] : -0.8007053462211116
The weight from 19 at layers[3] to 20 at layers[4] : 0.9594412274529027
Prediction: 0.4477145403917822
Prediction: 0.44384068472048943
Prediction: 0.44955657205136573
Prediction: 0.44559590058743986
```

The instances list contains 4 records. From the input side, record 1 is [0,0], record 2 is [0,1], record 3 is [1,0], and record 4 is [1,1]. Comparing the output against the expected results:

  • record 1: Prediction: 0.4477145403917822, actual result 0
  • record 2: Prediction: 0.44384068472048943, actual result 1
  • record 3: Prediction: 0.44955657205136573, actual result 1
  • record 4: Prediction: 0.44559590058743986, actual result 0

We have run the network only once, using the Sigmoid activation function, to see how the first two columns relate to the third, and this first run is not accurate. In the next chapter we will work backwards from the result, examine how the weight attached to each node contributes to the error, and adjust that contribution according to our algorithm; as the training loop proceeds, the error grows smaller and smaller. In TensorFlow's visualization the initial error is around 50%, and our initial error is similarly large: the true results are either 0 or 1, while our predictions are values such as 0.4477145403917822. To reduce the error as the program runs, every earlier step must be retraced from the result to check whether its weight needs adjusting; once the weights are adjusted, the next run produces different neuron values, because the weights have changed, and repeated adjustment brings the output closer and closer to the target.
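To put a single number on how large this initial error is, one can compute the mean squared error of the four predictions above against the XOR targets (0, 1, 1, 0). This is an illustrative calculation, not part of the book's code:

```python
# Predictions from the run above, compared against the XOR targets
predictions = [0.4477145403917822, 0.44384068472048943,
               0.44955657205136573, 0.44559590058743986]
targets = [0, 1, 1, 0]

# Mean squared error over the four instances
mse = sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)
print(round(mse, 4))  # → 0.2528
```

An untrained network whose outputs hover near 0.5 for both classes gives an MSE of roughly 0.25, which matches the "about 50% error" the text mentions; training should drive this value toward 0.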

 
