Using the InternLM (书生·浦语) Large Language Models

This post walks through using the InternLM (书生·浦语) family of large language models: chat with InternLM-Chat-7B, agent tool calling with Lagent, and text-image understanding and composition with InternLM-XComposer (浦语·灵笔).

InternLM-Chat-7B Chat Demo

Environment Setup
```python
!conda create --name internlm-chat --clone=/root/share/conda_envs/internlm-base
```
```python
%pip install -q --upgrade pip
```
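The rest of the install step pins the project dependencies. A reconstruction based on the versions visible in the `pip list` output below (not the verbatim original cell):

```python
# Versions taken from the pip list output below; treat this line as a reconstruction
%pip install modelscope==1.9.5 transformers==4.35.2 streamlit==1.24.0 sentencepiece==0.1.99 accelerate==0.24.1
```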
```python
# List the packages currently installed in the environment
%pip list
```
```
Package                   Version
------------------------- ------------
accelerate 0.24.1
addict 2.4.0
aiohttp 3.9.1
aiosignal 1.3.1
aliyun-python-sdk-core 2.14.0
aliyun-python-sdk-kms 2.16.2
altair 5.2.0
asttokens 2.4.1
async-timeout 4.0.3
attrs 23.2.0
blinker 1.7.0
Brotli 1.0.9
cachetools 5.3.2
certifi 2023.11.17
cffi 1.16.0
charset-normalizer 2.0.4
click 8.1.7
comm 0.2.1
crcmod 1.7
cryptography 41.0.3
datasets 2.13.0
debugpy 1.8.0
decorator 5.1.1
dill 0.3.6
einops 0.7.0
exceptiongroup 1.2.0
executing 2.0.1
filelock 3.13.1
frozenlist 1.4.1
fsspec 2023.12.2
gast 0.5.4
gitdb 4.0.11
GitPython 3.1.40
gmpy2 2.1.2
huggingface-hub 0.20.2
idna 3.4
importlib-metadata 6.11.0
ipykernel 6.28.0
ipython 8.19.0
ipywidgets 8.1.1
jedi 0.19.1
Jinja2 3.1.2
jmespath 0.10.0
jsonschema 4.20.0
jsonschema-specifications 2023.12.1
jupyter_client 8.6.0
jupyter_core 5.7.0
jupyterlab-widgets 3.0.9
markdown-it-py 3.0.0
MarkupSafe 2.1.1
matplotlib-inline 0.1.6
mdurl 0.1.2
mkl-fft 1.3.8
mkl-random 1.2.4
mkl-service 2.4.0
modelscope 1.9.5
mpmath 1.3.0
multidict 6.0.4
multiprocess 0.70.14
nest-asyncio 1.5.8
networkx 3.1
numpy 1.26.2
oss2 2.18.4
packaging 23.2
pandas 2.1.4
parso 0.8.3
pexpect 4.9.0
Pillow 9.5.0
pip 23.3.2
platformdirs 4.1.0
prompt-toolkit 3.0.43
protobuf 4.25.1
psutil 5.9.7
ptyprocess 0.7.0
pure-eval 0.2.2
pyarrow 14.0.2
pycparser 2.21
pycryptodome 3.19.1
pydeck 0.8.1b0
Pygments 2.17.2
Pympler 1.0.1
pyOpenSSL 23.2.0
PySocks 1.7.1
python-dateutil 2.8.2
pytz 2023.3.post1
pytz-deprecation-shim 0.1.0.post0
PyYAML 6.0.1
pyzmq 25.1.2
referencing 0.32.1
regex 2023.12.25
requests 2.31.0
rich 13.7.0
rpds-py 0.16.2
safetensors 0.4.1
scipy 1.11.4
sentencepiece 0.1.99
setuptools 68.0.0
simplejson 3.19.2
six 1.16.0
smmap 5.0.1
sortedcontainers 2.4.0
stack-data 0.6.3
streamlit 1.24.0
sympy 1.11.1
tenacity 8.2.3
tokenizers 0.15.0
toml 0.10.2
tomli 2.0.1
toolz 0.12.0
torch 2.0.1
torchaudio 2.0.2
torchvision 0.15.2
tornado 6.4
tqdm 4.66.1
traitlets 5.14.1
transformers 4.35.2
triton 2.0.0
typing_extensions 4.7.1
tzdata 2023.4
tzlocal 4.3.1
urllib3 1.26.18
validators 0.22.0
watchdog 3.0.0
wcwidth 0.2.13
wheel 0.41.2
widgetsnbextension 4.0.9
xxhash 3.4.1
yapf 0.40.2
yarl 1.9.4
zipp 3.17.0
Note: you may need to restart the kernel to use updated packages.
```
Download the Model

```python
%mkdir -p /root/model/Shanghai_AI_Laboratory
```
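The download itself goes through ModelScope. A minimal sketch, assuming the `Shanghai_AI_Laboratory/internlm-chat-7b` model id and the `v1.0.3` revision from the official tutorial:

```python
from modelscope.hub.snapshot_download import snapshot_download

# Downloads internlm-chat-7b into /root/model (model id and revision are assumptions)
model_dir = snapshot_download('Shanghai_AI_Laboratory/internlm-chat-7b',
                              cache_dir='/root/model', revision='v1.0.3')
```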
Code Preparation

```python
%mkdir /root/code
```
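A sketch of the clone step, reconstructed from the log below (the repository URL is an assumption; the log only shows `Cloning into 'InternLM'...`):

```python
%cd /root/code
!git clone https://github.com/InternLM/InternLM.git  # URL is an assumption
```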
```
/root/code
Cloning into 'InternLM'...
/root/.conda/envs/internlm-chat/lib/python3.10/site-packages/IPython/core/magics/osm.py:417: UserWarning: using dhist requires you to install the `pickleshare` library.
self.shell.db['dhist'] = compress_dhist(dhist)[-100:]
remote: Enumerating objects: 2604, done.
remote: Counting objects: 100% (592/592), done.
remote: Compressing objects: 100% (264/264), done.
remote: Total 2604 (delta 324), reused 581 (delta 318), pack-reused 2012
Receiving objects: 100% (2604/2604), 4.87 MiB | 793.00 KiB/s, done.
Resolving deltas: 100% (1608/1608), done.
```
```python
%cd InternLM
!git checkout 3028f07cb79e5b1d7342f4ad8d11efad3fd13d17
```
```
/root/code/InternLM
Note: switching to '3028f07cb79e5b1d7342f4ad8d11efad3fd13d17'.
You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.
If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:
git switch -c <new-branch-name>
Or undo this operation with:
git switch -
Turn off this advice by setting config variable advice.detachedHead to false
HEAD is now at 3028f07 fix(readme): update README with original weight download link (#460)
```
Run from the Terminal

```python
# %%writefile /root/code/InternLM/cli_demo.py
```
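A minimal sketch of what `cli_demo.py` contains, reconstructed from the banner and prompts in the transcript below; the model path and the `model.chat` call (provided by the model's remote code via `trust_remote_code=True`) follow the official tutorial and should be treated as assumptions:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Local path from the download step above (an assumption)
model_name_or_path = "/root/model/Shanghai_AI_Laboratory/internlm-chat-7b"

tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name_or_path, trust_remote_code=True,
                                             torch_dtype=torch.bfloat16).cuda()
model = model.eval()

messages = []
print("=============Welcome to InternLM chatbot, type 'exit' to exit.=============")
while True:
    input_text = input("User  >>> ")
    if input_text == "exit":
        break
    # chat() returns the reply and updated history; it comes from the model's remote code
    response, history = model.chat(tokenizer, input_text, history=messages)
    messages.append((input_text, response))
    print(f"robot >>> {response}")
```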
```
Loading checkpoint shards: 0%| | 0/8 [00:00<?, ?it/s]
=============Welcome to InternLM chatbot, type 'exit' to exit.=============
User >>> 你好
robot >>> 你好!有什么我能帮你的吗?
User >>> 帮我写个 300 字的小故事
robot >>> 有一天,一个年轻的男孩来到了一个神秘的森林。他不知道这个森林里到底有什么,但是他知道,他必须进去。他跟着森林里的指引走了一段路,最终来到了一座古老的神庙。
当男孩走到神庙门前时,他感到一种奇怪的气息。他推开门,走进神庙。他发现里面有一个神秘的书房,里面放着许多古老的书籍和文物。他拿起一本破旧的书籍,翻到第一页,上面写着一个故事。
故事讲述了一个叫做“神灯”的神秘物品。男孩被这个故事吸引了,他开始阅读这本书,希望能够找到这个物品。他读了很长时间,最终理解了这个故事,明白了神灯的含义。
男孩离开了神庙,回到了家中。他开始思考,他是否应该去寻找神灯。他决定去寻找这个神秘的物品,他相信自己可以找到它。
他走了很长一段路,遇到了许多挑战和困难,但是他始终没有放弃。最终,他找到了神灯。他拿起这个神秘物品,开始使用它。
他发现,神灯可以帮助他实现他的梦想。他开始使用神灯,他能够做任何事情。他能够变成任何他想要的形状,他能够变得非常有力量。
然而,男孩开始意识到,神灯并不是一个简单的物品。他开始发现,他必须付出一些代价才能使用神灯。他必须做很多事情,才能够使用它。他开始思考,他是否愿意付出这个代价。
最终,男孩决定使用神灯,他开始使用它来帮助那些需要帮助的人。他开始变得非常有名,人们开始称呼他为“神灯使者”。
男孩意识到,他必须继续使用神灯,但是他必须小心使用它。他不能让神灯的力量控制他,他必须控制它。最终,男孩学会了如何使用神灯,他成为了一个真正的神灯使者。
男孩回到了神庙,他感谢神庙,感谢神灯,感谢那个神秘的物品。他知道,他必须继续使用神灯,但是必须小心使用它,不能让它控制他。
男孩回到了家中,他开始使用神灯来帮助那些需要帮助的人。他成为了一个真正的神灯使者,他的故事被传颂了下去,成为了一段神话。
User >>> exit
```
Run as a Web Demo

```python
%%writefile /root/code/InternLM/web_demo_user.py
```
```
Writing /root/code/InternLM/web_demo_user.py
```
```python
import os, sys
```
```python
%cd /root/code/InternLM/
```
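The launch step presumably mirrors the Streamlit invocation used in the Lagent section below, pointed at the file written above (a reconstruction; the resulting startup log follows):

```python
!{os.path.join(sys.exec_prefix, 'bin/streamlit')} run /root/code/InternLM/web_demo_user.py --server.address 127.0.0.1 --server.port 6006
```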
```
/root/code/InternLM
/root/.conda/envs/internlm-chat/lib/python3.10/site-packages/IPython/core/magics/osm.py:417: UserWarning: using dhist requires you to install the `pickleshare` library.
self.shell.db['dhist'] = compress_dhist(dhist)[-100:]
Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.
You can now view your Streamlit app in your browser.
URL: http://127.0.0.1:6006
load model begin.
Loading checkpoint shards: 100%|██████████████████| 8/8 [00:25<00:00, 3.20s/it]
/usr/local/lib/python3.8/dist-packages/huggingface_hub/file_download.py:983: UserWarning: Not enough free disk space to download the file. The expected file size is: 0.00 MB. The target location /root/.cache/huggingface/hub only has 0.00 MB free disk space.
warnings.warn(
/usr/local/lib/python3.8/dist-packages/huggingface_hub/file_download.py:983: UserWarning: Not enough free disk space to download the file. The expected file size is: 0.00 MB. The target location /root/.cache/huggingface/hub/models--internlm--internlm-chat-7b/blobs only has 0.00 MB free disk space.
warnings.warn(
tokenizer_config.json: 343B [00:00, 31.9kB/s]
/usr/local/lib/python3.8/dist-packages/huggingface_hub/file_download.py:983: UserWarning: Not enough free disk space to download the file. The expected file size is: 0.01 MB. The target location /root/.cache/huggingface/hub only has 0.00 MB free disk space.
warnings.warn(
/usr/local/lib/python3.8/dist-packages/huggingface_hub/file_download.py:983: UserWarning: Not enough free disk space to download the file. The expected file size is: 0.01 MB. The target location /root/.cache/huggingface/hub/models--internlm--internlm-chat-7b/blobs only has 0.00 MB free disk space.
warnings.warn(
tokenization_internlm.py: 8.95kB [00:00, 35.6MB/s]
A new version of the following files was downloaded from https://huggingface.co/internlm/internlm-chat-7b:
- tokenization_internlm.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
/usr/local/lib/python3.8/dist-packages/huggingface_hub/file_download.py:983: UserWarning: Not enough free disk space to download the file. The expected file size is: 1.66 MB. The target location /root/.cache/huggingface/hub only has 0.00 MB free disk space.
warnings.warn(
/usr/local/lib/python3.8/dist-packages/huggingface_hub/file_download.py:983: UserWarning: Not enough free disk space to download the file. The expected file size is: 1.66 MB. The target location /root/.cache/huggingface/hub/models--internlm--internlm-chat-7b/blobs only has 0.00 MB free disk space.
warnings.warn(
tokenizer.model: 100%|█████████████████████| 1.66M/1.66M [00:00<00:00, 2.92MB/s]
special_tokens_map.json: 95.0B [00:00, 18.2kB/s]
load model end.
load model begin.
load model end.
load model begin.
load model end.
^C
Stopping...
```
Once the demo is running, use SSH to forward the remote server's port 6006 to a local port (also 6006 here). The SSH port differs per dev machine, since it is assigned dynamically at creation time; mine is 33449. A typical command is sketched after the next cell.
```bash
# Run on the local machine; your local public key must be added to your server account first
```
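A typical port-forwarding invocation looks like the following sketch; the jump host `ssh.intern-ai.org.cn` is an assumption based on the InternStudio platform, so substitute your own host and port:

```bash
# -C compress, -N run no remote command, -g allow remote hosts to use the forwarded port
ssh -CNg -L 6006:127.0.0.1:6006 root@ssh.intern-ai.org.cn -p 33449
```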
Lagent Agent Tool Calling

Lagent is a lightweight, open-source agent framework built on large language models. It lets you quickly turn an LLM into several kinds of agents and ships with a set of typical tools that extend what the model can do. Lagent is a good way to exercise the full capability of InternLM.

Environment Setup

Same environment as the InternLM-Chat-7B chat section above.

Download the Model

Same model as the InternLM-Chat-7B chat section above.

Install Lagent
```python
%cd /root/code
```
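A sketch of the clone step, reconstructed from the log below (the repository URL is an assumption; the log only shows `Cloning into 'lagent'...`). The official walkthrough then installs Lagent in editable mode with `pip install -e .` after pinning the commit in the next cell:

```python
!git clone https://github.com/InternLM/lagent.git  # URL is an assumption
```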
```
/root/code
Cloning into 'lagent'...
remote: Enumerating objects: 414, done.
remote: Counting objects: 100% (414/414), done.
remote: Compressing objects: 100% (188/188), done.
remote: Total 414 (delta 197), reused 414 (delta 197), pack-reused 0
Receiving objects: 100% (414/414), 214.97 KiB | 306.00 KiB/s, done.
Resolving deltas: 100% (197/197), done.
```
```python
%cd /root/code/lagent
!git checkout 511b03889010c4811b1701abb153e02b8e94fb5e
```
```
/root/code/lagent
Note: switching to '511b03889010c4811b1701abb153e02b8e94fb5e'.
You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.
If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:
git switch -c <new-branch-name>
Or undo this operation with:
git switch -
Turn off this advice by setting config variable advice.detachedHead to false
HEAD is now at 511b038 update header-logo (#72)
```
Inference
```python
%%writefile /root/code/lagent/examples/react_web_demo_user.py
```
```
Writing /root/code/lagent/examples/react_web_demo_user.py
```
```python
import sys, os
```
```python
!HF_ENDPOINT=https://hf-mirror.com {os.path.join(sys.exec_prefix, 'bin/streamlit')} run /root/code/lagent/examples/react_web_demo_user.py --server.address 127.0.0.1 --server.port 6006
```
```
Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.
You can now view your Streamlit app in your browser.
URL: http://127.0.0.1:6006
A new version of Streamlit is available.
See what's new at https://discuss.streamlit.io/c/announcements
Enter the following command to upgrade:
$pip install streamlit --upgrade
Loading checkpoint shards: 100%|██████████████████| 8/8 [00:14<00:00, 1.83s/it]
2024-01-06 15:44:48.261 [{'role': 'user', 'content': '你好'}, {'role': 'assistant', 'content': 'Thought: 你好,有什么我可以帮助你的吗?\nAction: PythonInterpreter\nAction Input: def solution():\n answer = "你好"\n return answer'}, {'role': 'system', 'content': 'Response:你好\n'}, {'role': 'assistant', 'content': 'Thought: 我已经收到了你的消息,你需要我做什么?\nAction: PythonInterpreter\nAction Input: def solution():\n answer = "你好"\n return answer'}, {'role': 'system', 'content': 'Response:你好\n'}, {'role': 'assistant', 'content': 'Thought: 我已经收到了你的消息,但是你的消息中没有提供任何需要我帮助的问题或指令。请重新发送你的问题或指令,我会尽力帮助你。\nAction: PythonInterpreter\nAction Input: def solution():\n answer = "你好"\n return answer'}, {'role': 'system', 'content': 'Response:你好\n'}, {'role': 'assistant', 'content': 'Thought: 我已经收到了你的消息,但是你的消息中没有提供任何需要我帮助的问题或指令。请重新发送你的问题或指令,我会尽力帮助你。\nAction: PythonInterpreter\nAction Input: def solution():\n answer = "你好"\n return answer'}, {'role': 'system', 'content': 'Response:你好\n'}]
2024-01-06 15:45:52.619 [{'role': 'user', 'content': '帮我求解:4x + 3y = 7; x + y = 2,x 和 y 分别等于多少'}, {'role': 'assistant', 'content': 'Thought: 这是一道方程求解题,需要用到求解器Solver来求解这个方程组。\nAction: PythonInterpreter\nAction Input: def solution():\n\tfrom sympy import symbols, Eq, solve, E, E\n\tx = symbols("x", real=True)\n\ty = symbols("y", real=True)\n\tresult = solve([Eq(4*x + 3*y, 7), Eq(x + y, 2)], [x, y], dict=True)\n\treturn result'}, {'role': 'system', 'content': 'Response:[{x: 1, y: 1}]\n'}, {'role': 'assistant', 'content': 'Thought: Base on the result of the code, the answer is:\nFinal Answer: 根据求解器的计算结果,x 的值为 1,y 的值为 1。'}]
^C
Stopping...
```
As before, once the app is running, forward the remote server's port 6006 to a local port over SSH; the command is the same as in the chat section above, and my dynamically assigned SSH port is again 33449.

```bash
# Run on the local machine; your local public key must be added to your server account first
```
InternLM-XComposer (浦语·灵笔) Text-Image Understanding and Composition

Deploy a text-image understanding and composition demo with the internlm-xcomposer-7b model.

Environment Setup
```python
# The base environment is the same as in the first section; a few extra dependencies are needed on top
```
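The extra dependencies roughly match the pins from the official InternLM-XComposer walkthrough (a reconstruction; treat the exact versions as assumptions):

```python
%pip install transformers==4.33.1 timm==0.4.12 sentencepiece==0.1.99 gradio==3.44.4 markdown2==2.4.10 xlsxwriter==3.1.2 einops accelerate
```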
Download the Model

```python
%mkdir -p /root/model/Shanghai_AI_Laboratory
```
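As before, the download goes through ModelScope; a minimal sketch, assuming the `Shanghai_AI_Laboratory/internlm-xcomposer-7b` model id and the `master` revision:

```python
from modelscope.hub.snapshot_download import snapshot_download

# Downloads internlm-xcomposer-7b into /root/model (model id and revision are assumptions)
model_dir = snapshot_download('Shanghai_AI_Laboratory/internlm-xcomposer-7b',
                              cache_dir='/root/model', revision='master')
```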
Inference

```python
%cd /root/code
```
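From the log below, this step clones InternLM-XComposer and pins a specific commit; a sketch (the repository URL is an assumption; the commit hash is taken from the log):

```python
!git clone https://github.com/InternLM/InternLM-XComposer.git  # URL is an assumption
%cd InternLM-XComposer
!git checkout 3e8c79051a1356b9c388a6447867355c0634932d
```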
```
/root/code
Cloning into 'InternLM-XComposer'...
/root/.conda/envs/internlm-chat/lib/python3.10/site-packages/IPython/core/magics/osm.py:417: UserWarning: using dhist requires you to install the `pickleshare` library.
self.shell.db['dhist'] = compress_dhist(dhist)[-100:]
remote: Enumerating objects: 680, done.
remote: Counting objects: 100% (680/680), done.
remote: Compressing objects: 100% (273/273), done.
remote: Total 680 (delta 361), reused 680 (delta 361), pack-reused 0
Receiving objects: 100% (680/680), 10.74 MiB | 8.78 MiB/s, done.
Resolving deltas: 100% (361/361), done.
/root/code/InternLM-XComposer
Note: switching to '3e8c79051a1356b9c388a6447867355c0634932d'.
You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.
If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:
git switch -c <new-branch-name>
Or undo this operation with:
git switch -
Turn off this advice by setting config variable advice.detachedHead to false
HEAD is now at 3e8c790 add polar in readme
```
```python
import sys
```
```python
%cd /root/code/InternLM-XComposer
```
```
/root/code/InternLM-XComposer
/root/.conda/envs/internlm-chat/lib/python3.10/site-packages/IPython/core/magics/osm.py:417: UserWarning: using dhist requires you to install the `pickleshare` library.
self.shell.db['dhist'] = compress_dhist(dhist)[-100:]
```
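The demo launch itself is not shown above; per the official walkthrough it looks roughly like the line below (the `--folder`, `--num_gpus`, and `--port` flags are assumptions from that walkthrough):

```python
!{sys.executable} examples/web_demo.py --folder /root/model/Shanghai_AI_Laboratory/internlm-xcomposer-7b --num_gpus 1 --port 6006
```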
```
Init VIT ... Done
Init Perceive Sampler ... Done
Init InternLM ... Done
Loading checkpoint shards: 100%|██████████████████| 4/4 [00:25<00:00, 6.37s/it]
load model done: <class 'transformers_modules.internlm-xcomposer-7b.modeling_InternLM_XComposer.InternLMXComposerForCausalLM'>
/root/code/InternLM-XComposer/examples/web_demo.py:1068: GradioDeprecationWarning: The `style` method is deprecated. Please set these arguments in the constructor instead.
chat_textbox = gr.Textbox(
Running on local URL: http://0.0.0.0:6006
init
Could not create share link. Missing file: /root/.conda/envs/internlm-chat/lib/python3.10/site-packages/gradio/frpc_linux_amd64_v0.2.
Please check your internet connection. This can happen if your antivirus software blocks the download of this file. You can install manually by following these steps:
1. Download this file: https://cdn-media.huggingface.co/frpc-gradio-0.2/frpc_linux_amd64
2. Rename the downloaded file to: frpc_linux_amd64_v0.2
3. Move the file to this location: /root/.conda/envs/internlm-chat/lib/python3.10/site-packages/gradio
<object object at 0x7efdf06ec340>
郁金香(学名:Tulipa gesneriana L.)是百合科郁金香属植物,又名洋荷花、草麝香等。原产于地中海沿岸以及西亚和南西伯利亚的半干旱或高寒地区。荷兰人最早将郁金香作为观赏花卉;16世纪中叶,郁金香被引入中国;17世纪传入欧洲各国。
郁金香花色丰富,有红、橙、黄、紫、白、黑、双色及镶边等多种颜色,而且同一植株上可呈现不同色彩的花朵。花朵硕大艳丽,富丽堂皇,芳香四溢,给人以庄重、华贵、富丽之感。它不仅具有很高的观赏价值,而且还有较高的经济作物品种开发利用价值。
(long run of <TOKENS_UNUSED_1> padding tokens, truncated)
适合插入图像的行是<Seg0>, <Seg2>.
[0, 2]
郁金香的花朵,颜色丰富多样。
郁金香的花朵,花朵硕大艳丽。
{0: '郁金香的花朵,颜色丰富多样。', 2: '郁金香的花朵,花朵硕大艳丽。'}
{0: '郁金香的花朵,颜色丰富多样。', 2: '郁金香的花朵,花朵硕大艳丽。'}
https://static.openxlab.org.cn/lingbi/jpg-images/105d05c6bc63e3f446c715f10b1c5bb349e09c1e2860fa2d510d0aabde193a1a.jpg
download image with url
image downloaded
https://static.openxlab.org.cn/lingbi/jpg-images/11eba488365ed9c830601ab473788c82b6fda279d05c2485227ee8cb089b2f51.jpg
download image with url
image downloaded
model_select_image
0 郁金香(学名:Tulipa gesneriana L.)是百合科郁金香属植物,又名洋荷花、草麝香等。原产于地中海沿岸以及西亚和南西伯利亚的半干旱或高寒地区。荷兰人最早将郁金香作为观赏花卉;16世纪中叶,郁金香被引入中国;17世纪传入欧洲各国。
<div align="center"> <img src="file=articles/如何培育郁金香/temp_1000_0.png" width = 500/> </div>
1 郁金香花色丰富,有红、橙、黄、紫、白、黑、双色及镶边等多种颜色,而且同一植株上可呈现不同色彩的花朵。花朵硕大艳丽,富丽堂皇,芳香四溢,给人以庄重、华贵、富丽之感。它不仅具有很高的观赏价值,而且还有较高的经济作物品种开发利用价值。
2 (long run of <TOKENS_UNUSED_1> padding tokens, truncated)
<div align="center"> <img src="file=articles/如何培育郁金香/temp_1002_2.png" width = 500/> </div>
^C
Keyboard interruption in main thread... closing server.
```
Once the demo is running, forward the remote server's port 6006 to a local port over SSH as before (same command as in the first section); the dynamically assigned SSH port for this dev machine is 34000.
```bash
# Run on the local machine; your local public key must be added to your server account first
```
Download InternLM-20B's config.json from Hugging Face

To get familiar with Hugging Face downloads, use the `huggingface_hub` Python package to download the `config.json` file of InternLM-20B to local disk (a screenshot of the download process is required for the assignment).
```python
%pip install -U huggingface_hub
```
```
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Requirement already satisfied: huggingface_hub in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (0.20.2)
Requirement already satisfied: filelock in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (from huggingface_hub) (3.13.1)
Requirement already satisfied: fsspec>=2023.5.0 in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (from huggingface_hub) (2023.12.2)
Requirement already satisfied: requests in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (from huggingface_hub) (2.31.0)
Requirement already satisfied: tqdm>=4.42.1 in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (from huggingface_hub) (4.66.1)
Requirement already satisfied: pyyaml>=5.1 in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (from huggingface_hub) (6.0.1)
Requirement already satisfied: typing-extensions>=3.7.4.3 in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (from huggingface_hub) (4.9.0)
Requirement already satisfied: packaging>=20.9 in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (from huggingface_hub) (23.2)
Requirement already satisfied: charset-normalizer<4,>=2 in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (from requests->huggingface_hub) (2.0.4)
Requirement already satisfied: idna<4,>=2.5 in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (from requests->huggingface_hub) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (from requests->huggingface_hub) (1.26.18)
Requirement already satisfied: certifi>=2017.4.17 in /root/.conda/envs/internlm-chat/lib/python3.10/site-packages (from requests->huggingface_hub) (2023.11.17)
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Note: you may need to restart the kernel to use updated packages.
```
```python
import sys, os
```
```python
!HF_ENDPOINT=https://hf-mirror.com {os.path.join(sys.exec_prefix, 'bin/huggingface-cli')} download --resume-download internlm/internlm-20b config.json --local-dir .
```
```
Consider using `hf_transfer` for faster downloads. This solution comes with some limitations. See https://huggingface.co/docs/huggingface_hub/hf_transfer for more details.
./config.json
```
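The same file can also be fetched from Python with `hf_hub_download`; a minimal sketch (the mirror endpoint is optional and matches the CLI call above):

```python
import os
os.environ['HF_ENDPOINT'] = 'https://hf-mirror.com'  # must be set before importing huggingface_hub

from huggingface_hub import hf_hub_download

# Download only config.json from the internlm/internlm-20b repo into the current directory
path = hf_hub_download(repo_id='internlm/internlm-20b', filename='config.json', local_dir='.')
print(path)
```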
```python
import json
```
```python
with open('./config.json', 'r') as jf:
    config = json.load(jf)
config
```
```
{'architectures': ['InternLMForCausalLM'],
'auto_map': {'AutoConfig': 'configuration_internlm.InternLMConfig',
'AutoModel': 'modeling_internlm.InternLMForCausalLM',
'AutoModelForCausalLM': 'modeling_internlm.InternLMForCausalLM'},
'bias': False,
'bos_token_id': 1,
'eos_token_id': 2,
'hidden_act': 'silu',
'hidden_size': 5120,
'initializer_range': 0.02,
'intermediate_size': 13824,
'max_position_embeddings': 4096,
'model_type': 'internlm',
'num_attention_heads': 40,
'num_hidden_layers': 60,
'num_key_value_heads': 40,
'pad_token_id': 2,
'pretraining_tp': 1,
'rms_norm_eps': 1e-06,
'rope_scaling': None,
'rope_theta': 10000.0,
'tie_word_embeddings': False,
'torch_dtype': 'bfloat16',
'transformers_version': '4.33.1',
'use_cache': True,
'vocab_size': 103168,
'rotary': {'base': 10000, 'type': 'dynamic'}}
```