Preface
I recently came across a new idea: the distinction between scientific discovery and technological invention, with discovery ranked above invention. I find it fairly persuasive. We often say that China's science and technology trail Europe and America, yet in day-to-day experience, whether in construction, hardware, software, or applied theory, China already seems ahead. So why the claim that we lag?
The discovery-over-invention idea explains it. Online payment, the construction industry, and so on are technological inventions, not scientific discoveries. Discovery leads invention, and Europe and America are far ahead of us in discovery; in invention their lead is smaller, though in some areas it is still large, ChatGPT being one example.
My point in saying all this: software development is also technological invention, so even the very best in this field only amount to so much. Put differently, even a Tsinghua or Peking University graduate, once they join the ranks of technological invention, is nothing special.
Today, some 90% of master's and PhD students work on invention-oriented topics, which means the great majority of them will end up as workers.
Neural networks are not hard. This series of articles is proof: with no Python and no algorithms background, you can still learn them in a short time. By my estimate, anywhere from a week to a month is enough.
Yet this is exactly what many graduate students and PhDs study. That means a typical graduate thesis boils down to about a week's worth of knowledge, while the deeper theory remains only a fuzzy notion to them, because it is only a fuzzy notion to their advisors as well.
The result: judged as scholars, these people are not really doing scholarship; judged as workers, they lack practiced skills.
And if they do not land on the client (甲方) side, they will all end up in worker roles. That produces a curious phenomenon: Tsinghua and Peking University graduates start from the same line as high-school graduates, and in the eyes of a practiced worker both are blank slates.
Who masters that week's worth of knowledge and learns to apply it first has nothing to do with pedigree.
So, within the worker's domain, everyone is equal.
Those who know it need not look down on those who don't, and those who don't need not assume the others are on some higher plane.
What This Article Covers
This article walks through building a chatbot with a neural network.
Preparation
Before running the code, we need the nltk package. Install it first:
pip install nltk
Then download the NLTK data. Create a py file with the following code:
import nltk
nltk.download()
Then open cmd as administrator and run that file:
C:\Project\python_test\github\PythonTest\venv\Scripts\python.exe C:\Project\python_test\github\PythonTest\robot_nltk\dlnltk.py
A download dialog pops up; change the save directory as shown below.
PS: Some sources say you can run nltk.download('punkt') directly to fetch just the package we need, but that did not work for me, so I downloaded everything.
# nltk.download('punkt')    # downloads the 'punkt' resource from NLTK (Natural Language Toolkit), typically used for tokenization
# nltk.download('popular')  # downloads most of the commonly used NLTK resources, a superset of punkt
Writing the Code
Writing the Model
First, write a NeuralNet (model.py) as follows:
import torch.nn as nn
class NeuralNet(nn.Module):
    def __init__(self, input_size, hidden_size, num_classes):
        super(NeuralNet, self).__init__()
        self.l1 = nn.Linear(input_size, hidden_size)
        self.l2 = nn.Linear(hidden_size, hidden_size)
        self.l3 = nn.Linear(hidden_size, num_classes)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.l1(x)
        out = self.relu(out)
        out = self.l2(out)
        out = self.relu(out)
        out = self.l3(out)
        # no activation and no softmax at the end
        return out
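To see what this network actually computes, here is the same forward pass written out in NumPy with randomly initialized stand-in weights (in the real model these are learned during training; the sizes here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size, num_classes = 5, 8, 3

# random stand-ins for the learned weights of l1, l2, l3
W1 = rng.standard_normal((input_size, hidden_size))
W2 = rng.standard_normal((hidden_size, hidden_size))
W3 = rng.standard_normal((hidden_size, num_classes))

def relu(x):
    return np.maximum(x, 0)

x = np.ones(input_size)   # a dummy bag-of-words input
out = relu(x @ W1)        # l1 + ReLU
out = relu(out @ W2)      # l2 + ReLU
out = out @ W3            # l3: raw scores, one per tag, no softmax
print(out.shape)          # (3,)
```

The output is one raw score per tag; softmax is applied later, at prediction time.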
Next, write a utility module, nltk_utils.py:
import numpy as np
import nltk
from nltk.stem.porter import PorterStemmer

stemmer = PorterStemmer()

def tokenize(sentence):
    return nltk.word_tokenize(sentence)

def stem(word):
    return stemmer.stem(word.lower())

def bag_of_words(tokenized_sentence, words):
    sentence_words = [stem(word) for word in tokenized_sentence]
    bag = np.zeros(len(words), dtype=np.float32)
    for idx, w in enumerate(words):
        if w in sentence_words:
            bag[idx] = 1
    return bag

a = "How long does shipping take?"
print(a)
a = tokenize(a)
print(a)
This file can be run directly to test the utility functions.
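To make bag_of_words concrete, here is the same idea in plain Python, with a made-up vocabulary and an already-stemmed sentence:

```python
# hypothetical stemmed vocabulary and sentence (illustration only)
vocabulary = ["hi", "how", "long", "ship", "take", "thank"]
sentence_words = ["how", "long", "take"]

# one slot per vocabulary word: 1.0 if the word occurs in the sentence
bag = [1.0 if w in sentence_words else 0.0 for w in vocabulary]
print(bag)  # [0.0, 1.0, 1.0, 0.0, 1.0, 0.0]
```

The resulting fixed-length vector is what the network takes as input, regardless of how long the original sentence was.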
Stemming and Tokenization
Stemming reduces a word to its stem. The logic:
words = ["organize", "organizes", "organizing"]
stemmed_words = [stem(w) for w in words]
print(stemmed_words)
The process is shown in the figure below:
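If you don't want to pull in nltk just to see the idea, a crude suffix-stripping stemmer can be sketched in a few lines of plain Python. This is far simpler than the real Porter algorithm and only handles a few endings:

```python
def naive_stem(word):
    # illustration only: strip a few common English suffixes
    for suffix in ("izing", "izes", "ized", "ize", "ing", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[:-len(suffix)]
    return word

words = ["organize", "organizes", "organizing"]
print([naive_stem(w) for w in words])  # ['organ', 'organ', 'organ']
```

All three forms collapse to the same stem, which is exactly why stemming shrinks the vocabulary before building the bag-of-words vector.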
Tokenization splits a sentence into tokens.
The following code tests tokenization:
a="How long does shipping take?"
print(a)
a = tokenize(a)
print(a)
Tokenization roughly works as follows:
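nltk.word_tokenize does the heavy lifting above, but the core idea can be approximated with a single regular expression. This is a rough sketch; the real tokenizer handles many more cases, such as contractions:

```python
import re

def simple_tokenize(sentence):
    # words stay together; each punctuation mark becomes its own token
    return re.findall(r"\w+|[^\w\s]", sentence)

print(simple_tokenize("How long does shipping take?"))
# ['How', 'long', 'does', 'shipping', 'take', '?']
```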
Writing the Sample Data
Create a JSON file, intents.json (English version):
{
    "intents": [
        {"tag": "greeting", "patterns": ["Hi", "Hey", "How are you", "Is anyone there?", "Hello", "Good day"], "responses": ["Hey :-)", "Hello, thanks for visiting", "Hi there, what can I do for you?", "Hi there, how can I help?"]},
        {"tag": "goodbye", "patterns": ["Bye", "See you later", "Goodbye"], "responses": ["See you later, thanks for visiting", "Have a nice day", "Bye! Come back again soon."]},
        {"tag": "thanks", "patterns": ["Thanks", "Thank you", "That's helpful", "Thank's a lot!"], "responses": ["Happy to help!", "Any time!", "My pleasure"]},
        {"tag": "items", "patterns": ["Which items do you have?", "What kinds of items are there?", "What do you sell?"], "responses": ["We sell coffee and tea", "We have coffee and tea"]},
        {"tag": "payments", "patterns": ["Do you take credit cards?", "Do you accept Mastercard?", "Can I pay with Paypal?", "Are you cash only?"], "responses": ["We accept VISA, Mastercard and Paypal", "We accept most major credit cards, and Paypal"]},
        {"tag": "delivery", "patterns": ["How long does delivery take?", "How long does shipping take?", "When do I get my delivery?"], "responses": ["Delivery takes 2-4 days", "Shipping takes 2-4 days"]},
        {"tag": "funny", "patterns": ["Tell me a joke!", "Tell me something funny!", "Do you know a joke?"], "responses": ["Why did the hipster burn his mouth? He drank the coffee before it was cool.", "What did the buffalo say when his son left for college? Bison."]}
    ]
}
intents_cn.json holds the Chinese version of the data:
{
    "intents": [
        {"tag": "greeting", "patterns": ["你好", "嗨", "您好", "有谁在吗?", "你好呀", "早上好", "下午好", "晚上好"], "responses": ["你好!有什么我可以帮忙的吗?", "您好!感谢您的光临。", "嗨!有什么我可以为您效劳的吗?", "早上好!今天怎么样?"]},
        {"tag": "goodbye", "patterns": ["再见", "拜拜", "下次见", "保重", "晚安"], "responses": ["再见!希望很快能再次见到你。", "拜拜!祝你有个愉快的一天。", "保重!下次见。", "晚安,祝你做个好梦!"]},
        {"tag": "thanks", "patterns": ["谢谢", "感谢", "多谢", "非常感谢"], "responses": ["不客气!很高兴能帮到你。", "没问题!随时为您服务。", "别客气!希望能帮到您。", "很高兴能帮忙!"]},
        {"tag": "help", "patterns": ["你能帮我做什么?", "你能做什么?", "你能帮助我吗?", "我需要帮助", "能帮我一下吗?"], "responses": ["我可以帮您回答问题、提供信息,或者进行简单的任务。", "我能帮助您查询信息、安排任务等。", "您可以问我问题,或者让我做一些简单的事情。", "请告诉我您需要的帮助!"]},
        {"tag": "weather", "patterns": ["今天天气怎么样?", "今天的天气如何?", "天气预报是什么?", "外面冷吗?", "天气好不好?"], "responses": ["今天的天气很好,适合外出!", "今天天气有点冷,记得穿暖和点。", "今天天气晴朗,适合去散步。", "天气晴,温度适宜,非常适合外出。"]},
        {"tag": "about", "patterns": ["你是什么?", "你是谁?", "你是做什么的?", "你能做些什么?"], "responses": ["我是一个聊天机器人,可以回答您的问题和帮助您解决问题。", "我是一个智能助手,帮助您完成各种任务。", "我是一个虚拟助手,可以处理简单的任务和查询。", "我可以帮助您获取信息,或者做一些简单的任务。"]}
    ]
}
Training the Model
The training code is as follows:
import numpy as np
import random
import json

import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader

from nltk_utils import bag_of_words, tokenize, stem
from model import NeuralNet

with open('intents_cn.json', 'r', encoding='utf-8') as f:
    intents = json.load(f)

all_words = []
tags = []
xy = []
# loop through each sentence in our intents patterns
for intent in intents['intents']:
    tag = intent['tag']
    # add to tag list
    tags.append(tag)
    for pattern in intent['patterns']:
        # tokenize each word in the sentence
        w = tokenize(pattern)
        # add to our words list
        all_words.extend(w)
        # add to xy pair
        xy.append((w, tag))

# stem and lower each word
ignore_words = ['?', '.', '!']
all_words = [stem(w) for w in all_words if w not in ignore_words]
# remove duplicates and sort
all_words = sorted(set(all_words))
tags = sorted(set(tags))

print(len(xy), "patterns")
print(len(tags), "tags:", tags)
print(len(all_words), "unique stemmed words:", all_words)

# create training data
X_train = []
y_train = []
for (pattern_sentence, tag) in xy:
    # X: bag of words for each pattern_sentence
    bag = bag_of_words(pattern_sentence, all_words)
    X_train.append(bag)
    # y: PyTorch CrossEntropyLoss needs only class labels, not one-hot
    label = tags.index(tag)
    y_train.append(label)

X_train = np.array(X_train)
y_train = np.array(y_train)

# Hyper-parameters
num_epochs = 1000
batch_size = 8
learning_rate = 0.001
input_size = len(X_train[0])
hidden_size = 8
output_size = len(tags)
print(input_size, output_size)

class ChatDataset(Dataset):
    def __init__(self):
        self.n_samples = len(X_train)
        self.x_data = X_train
        self.y_data = y_train

    # support indexing such that dataset[i] can be used to get i-th sample
    def __getitem__(self, index):
        return self.x_data[index], self.y_data[index]

    # we can call len(dataset) to return the size
    def __len__(self):
        return self.n_samples

dataset = ChatDataset()
train_loader = DataLoader(dataset=dataset, batch_size=batch_size, shuffle=True, num_workers=0)

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = NeuralNet(input_size, hidden_size, output_size).to(device)

# Loss and optimizer
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

# Train the model
for epoch in range(num_epochs):
    for (words, labels) in train_loader:
        words = words.to(device)
        labels = labels.to(dtype=torch.long).to(device)

        # Forward pass
        outputs = model(words)
        # if y would be one-hot, we must apply
        # labels = torch.max(labels, 1)[1]
        loss = criterion(outputs, labels)

        # Backward and optimize
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    if (epoch + 1) % 100 == 0:
        print(f'Epoch [{epoch+1}/{num_epochs}], Loss: {loss.item():.4f}')

print(f'final loss: {loss.item():.4f}')

data = {
    "model_state": model.state_dict(),
    "input_size": input_size,
    "hidden_size": hidden_size,
    "output_size": output_size,
    "all_words": all_words,
    "tags": tags
}

FILE = "data.pth"
torch.save(data, FILE)
print(f'training complete. file saved to {FILE}')
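Note that CrossEntropyLoss takes class indices as labels, not one-hot vectors. What it computes for a single sample can be written out in NumPy; the logits below are made-up numbers for one sample over three tags:

```python
import numpy as np

logits = np.array([2.0, 0.5, -1.0])  # raw network outputs for one sample
label = 0                            # index into tags, not a one-hot vector

# softmax turns logits into probabilities
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# cross-entropy is the negative log-probability assigned to the true class
loss = -np.log(probs[label])
print(round(float(loss), 4))  # 0.2413
```

The loss shrinks toward zero as the probability of the correct tag approaches 1, which is exactly what the training loop drives the weights toward.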
Writing the Chat Client
The chat code is as follows:
import random
import json

import torch

from model import NeuralNet
from nltk_utils import bag_of_words, tokenize

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

with open('intents_cn.json', 'r', encoding='utf-8') as json_data:
    intents = json.load(json_data)

FILE = "data.pth"
data = torch.load(FILE)

input_size = data["input_size"]
hidden_size = data["hidden_size"]
output_size = data["output_size"]
all_words = data['all_words']
tags = data['tags']
model_state = data["model_state"]

model = NeuralNet(input_size, hidden_size, output_size).to(device)
model.load_state_dict(model_state)
model.eval()

bot_name = "电脑"
print("Let's chat! (type 'quit' to exit)")
while True:
    # sentence = "do you use credit cards?"
    sentence = input("我:")
    if sentence == "quit":
        break

    sentence = tokenize(sentence)
    X = bag_of_words(sentence, all_words)
    X = X.reshape(1, X.shape[0])
    X = torch.from_numpy(X).to(device)

    output = model(X)
    _, predicted = torch.max(output, dim=1)
    tag = tags[predicted.item()]

    probs = torch.softmax(output, dim=1)
    prob = probs[0][predicted.item()]
    if prob.item() > 0.75:
        for intent in intents['intents']:
            if tag == intent["tag"]:
                print(f"{bot_name}: {random.choice(intent['responses'])}")
    else:
        print(f"{bot_name}: 我不知道")
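The prob.item() > 0.75 check above is a simple confidence gate: if the winning tag's softmax probability is low, the bot says it doesn't know instead of guessing. That decision in isolation, with made-up scores:

```python
import numpy as np

def pick_tag(logits, tags, threshold=0.75):
    # softmax over the raw network outputs
    probs = np.exp(logits - np.max(logits))
    probs /= probs.sum()
    best = int(np.argmax(probs))
    if probs[best] > threshold:
        return tags[best]
    return None  # not confident enough

tags = ["greeting", "goodbye", "thanks"]
print(pick_tag(np.array([4.0, 0.1, 0.2]), tags))  # confident -> 'greeting'
print(pick_tag(np.array([0.4, 0.3, 0.2]), tags))  # ambiguous -> None
```

The 0.75 threshold is a tunable trade-off: raise it and the bot refuses more often but misfires less, lower it and the reverse.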
A sample run looks like this:
Portal:
Learning AI from Zero - Python - PyTorch - Complete Collection
Note: This article is original. For any form of reproduction, please contact the author for authorization and credit the source!
If you found this article helpful, please click [Recommend] below. Many thanks!
https://www.cnblogs.com/kiba/p/18610399