
《Python金融大数据风控建模实战》, Chapter 15: Neural Network Models

Date: 2019-08-20 09:39:35



Chapter Introduction

Neural network models are the foundation of deep learning. Looking at the structure of a neural network, the model's unknown parameters are a set of weights: the more complex the network structure, the stronger its capacity to express nonlinear relationships, and the more weights there are to learn. The error backpropagation algorithm (error BackPropagation, or BP algorithm) is the best-known learning strategy for neural networks. It is used not only to train feed-forward neural networks but also other types of networks, such as recurrent neural networks, and it is likewise the algorithm used to train networks in deep learning.
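To make the mechanics concrete, the following is a minimal sketch (not taken from the book) of a single backpropagation update for a network with one hidden layer, written in NumPy. The toy data, layer sizes, and learning rate are arbitrary choices made only for this illustration.

import numpy as np

# Minimal sketch of one backpropagation (BP) update for a network with one
# hidden layer and a sigmoid output, using made-up toy data.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))           # 8 samples, 4 features (toy data)
y = rng.integers(0, 2, size=(8, 1))   # binary labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# The unknown parameters of the model are the weight matrices and biases.
W1 = rng.normal(scale=0.1, size=(4, 5)); b1 = np.zeros((1, 5))
W2 = rng.normal(scale=0.1, size=(5, 1)); b2 = np.zeros((1, 1))
lr = 0.1  # learning rate (arbitrary for the illustration)

# Forward pass.
h = sigmoid(X @ W1 + b1)   # hidden-layer activations
p = sigmoid(h @ W2 + b2)   # predicted probability of class 1

# Backward pass: propagate the error of the cross-entropy loss layer by layer.
d_out = (p - y) / len(X)                    # gradient at the output pre-activation
dW2 = h.T @ d_out; db2 = d_out.sum(axis=0, keepdims=True)
d_hidden = (d_out @ W2.T) * h * (1 - h)     # chain rule through the hidden sigmoid
dW1 = X.T @ d_hidden; db1 = d_hidden.sum(axis=0, keepdims=True)

# Gradient-descent step on every weight.
W2 -= lr * dW2; b2 -= lr * db2
W1 -= lr * dW1; b1 -= lr * db1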

Python Code Implementation and Comments

# Chapter 15: Neural network model
import os
import sys
# path = __file__
# path = os.path.abspath(path + ((os.sep + '..') * 2))
# sys.path.append(path)
import pandas as pd
import numpy as np
from sklearn.model_selection import train_test_split
import variable_encode as var_encode
from sklearn.metrics import confusion_matrix, recall_score, auc, roc_curve, precision_score, accuracy_score
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
import matplotlib.pyplot as plt
import matplotlib
matplotlib.rcParams['font.sans-serif'] = ['SimHei']  # font configuration for displaying Chinese characters in plots
matplotlib.rcParams['axes.unicode_minus'] = False    # display minus signs correctly
import warnings
warnings.filterwarnings("ignore")  # suppress warnings


# Read the data
def data_read(data_path, file_name):
    df = pd.read_csv(os.path.join(data_path, file_name), delim_whitespace=True, header=None)
    # Rename the variables
    columns = ['status_account', 'duration', 'credit_history', 'purpose', 'amount',
               'svaing_account', 'present_emp', 'income_rate', 'personal_status',
               'other_debtors', 'residence_info', 'property', 'age', 'inst_plans',
               'housing', 'num_credits', 'job', 'dependents', 'telephone',
               'foreign_worker', 'target']
    df.columns = columns
    # Map the label from {1, 2} to {0, 1}: 0 = good customer, 1 = bad customer
    df.target = df.target - 1
    # Split the data into data_train and data_test; the training set is used to fit the
    # encoding, and the test set is encoded with the learned encoding rules
    data_train, data_test = train_test_split(df, test_size=0.2, random_state=0, stratify=df.target)
    return data_train, data_test


# Separate discrete (categorical) variables from continuous variables
def category_continue_separation(df, feature_names):
    categorical_var = []
    numerical_var = []
    if 'target' in feature_names:
        feature_names.remove('target')
    # Check the dtype first: int or float columns are treated as continuous variables
    numerical_var = list(df[feature_names].select_dtypes(
        include=['int', 'float', 'int32', 'float32', 'int64', 'float64']).columns.values)
    categorical_var = [x for x in feature_names if x not in numerical_var]
    return categorical_var, numerical_var


if __name__ == '__main__':
    path = 'D:\\code\\chapter15'
    data_path = os.path.join(path, 'data')
    file_name = 'german.csv'
    # Read the data
    data_train, data_test = data_read(data_path, file_name)
    # Check the class balance of the training set
    print(sum(data_train.target == 0), data_train.target.sum())
    # Separate discrete and continuous variables
    feature_names = list(data_train.columns)
    feature_names.remove('target')
    categorical_var, numerical_var = category_continue_separation(data_train, feature_names)
    # WOE-encode the discrete variables directly
    var_all_bin = list(data_train.columns)
    var_all_bin.remove('target')
    # WOE encoding of the training set
    df_train_woe, dict_woe_map, dict_iv_values, var_woe_name = var_encode.woe_encode(
        data_train, data_path, categorical_var, data_train.target, 'dict_woe_map', flag='train')
    # WOE encoding of the test set
    df_test_woe, var_woe_name = var_encode.woe_encode(
        data_test, data_path, categorical_var, data_test.target, 'dict_woe_map', flag='test')
    # Fill missing values of the continuous variables with the training-set mean
    for i in numerical_var:
        if sum(data_train[i].isnull()) > 0:
            data_train[i].fillna(data_train[i].mean(), inplace=True)
    # Assemble the encoded training and test sets
    data_train.reset_index(drop=True, inplace=True)
    data_test.reset_index(drop=True, inplace=True)
    df_train_woe.reset_index(drop=True, inplace=True)  # keep row order aligned for the concat below
    df_test_woe.reset_index(drop=True, inplace=True)
    var_1 = list(numerical_var)  # copy so that 'target' is not appended to numerical_var itself
    var_1.append('target')
    data_train_1 = pd.concat([df_train_woe[var_woe_name], data_train[var_1]], axis=1)
    data_test_1 = pd.concat([df_test_woe[var_woe_name], data_test[var_1]], axis=1)
    # Extract the training and test data
    var_all = list(data_train_1.columns)
    var_all.remove('target')
    # Standardize the variables (fit the scaler on the training set only)
    scaler = StandardScaler().fit(data_train_1[var_all])
    data_train_1[var_all] = scaler.transform(data_train_1[var_all])
    data_test_1[var_all] = scaler.transform(data_test_1[var_all])
    x_train = np.array(data_train_1[var_all])
    y_train = np.array(data_train_1.target)
    x_test = np.array(data_test_1[var_all])
    y_test = np.array(data_test_1.target)

    # Neural network model
    # Initialize the network
    nn_model = MLPClassifier(hidden_layer_sizes=(50, 50), activation='relu', max_iter=300, alpha=0.01)
    # Train the neural network
    nn_model_fit = nn_model.fit(x_train, y_train)
    # Fitted attributes (for inspection)
    # nn_model_fit.loss_
    # nn_model_fit.coefs_
    # nn_model_fit.intercepts_
    # nn_model_fit.n_layers_
    # nn_model_fit.n_outputs_
    # Model prediction
    y_pred = nn_model_fit.predict(x_test)
    y_score_test = nn_model_fit.predict_proba(x_test)[:, 1]
    # Confusion matrix, recall, precision and accuracy
    cnf_matrix = confusion_matrix(y_test, y_pred)
    recall_value = recall_score(y_test, y_pred)
    precision_value = precision_score(y_test, y_pred)
    acc = accuracy_score(y_test, y_pred)
    print(cnf_matrix)
    print('Test set: model recall is {0}, and precision is {1}'.format(recall_value, precision_value))
    # Compute fpr and tpr
    fpr, tpr, thresholds = roc_curve(y_test, y_score_test)
    # Compute AR, Gini, KS, etc.
    roc_auc = auc(fpr, tpr)
    ks = max(tpr - fpr)
    ar = 2 * roc_auc - 1
    gini = ar
    print('Test set: model AR is {0}, and KS is {1}'.format(ar, ks))
    # KS curve
    plt.figure(figsize=(10, 6))
    fontsize_1 = 12
    plt.plot(np.linspace(0, 1, len(tpr)), tpr, '--', color='black', label='Lorenz curve of positive samples')
    plt.plot(np.linspace(0, 1, len(tpr)), fpr, ':', color='black', label='Lorenz curve of negative samples')
    plt.plot(np.linspace(0, 1, len(tpr)), tpr - fpr, '-', color='grey')
    plt.grid()
    plt.xticks(fontsize=fontsize_1)
    plt.yticks(fontsize=fontsize_1)
    plt.xlabel('Probability bins', fontsize=fontsize_1)
    plt.ylabel('Cumulative proportion (%)', fontsize=fontsize_1)
    plt.legend(fontsize=fontsize_1)
    plt.show()
    print(max(tpr - fpr))
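As a follow-up to the model-training step above, the snippet below is a small sketch, not part of the book's listing, of how the hidden_layer_sizes and alpha settings of the MLPClassifier could be chosen by cross-validated grid search instead of being fixed at (50, 50) and 0.01. The candidate grid values are illustrative guesses only.

# Hypothetical tuning step (not from the book): choose the network size and the L2
# penalty alpha of the MLPClassifier by 5-fold cross-validated grid search on x_train/y_train.
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

param_grid = {
    'hidden_layer_sizes': [(50,), (50, 50), (100,)],  # candidate architectures (illustrative)
    'alpha': [0.001, 0.01, 0.1],                      # candidate L2 regularization strengths
}
search = GridSearchCV(
    MLPClassifier(activation='relu', max_iter=300, random_state=0),
    param_grid, scoring='roc_auc', cv=5, n_jobs=-1)
search.fit(x_train, y_train)
print('best parameters:', search.best_params_)
print('best cross-validated AUC:', search.best_score_)
# The best estimator could then replace nn_model above:
# nn_model = search.best_estimator_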
