proekt / obuch · Commits

Commit ae43c80d
Authored 6 days ago by Мазур Грета Евгеньевна

micro zapusk no cross

Parent: 28456873
Branch: master
No related merge requests found.

Changes: 2 files
Showing 2 changed files with 4 additions and 2 deletions:
- .ipynb_checkpoints/micro_no_cross-checkpoint.py (+2, −1)
- micro_no_cross.py (+2, −1)
.ipynb_checkpoints/micro_no_cross-checkpoint.py (+2, −1)

@@ -103,6 +103,7 @@ class MultiTaskBert(BertPreTrainedModel):
# Create the model
base_model = MultiTaskBert.from_pretrained('bert-base-uncased').to(device)
base_model.save_pretrained('./micro_no_cross_fine_tuned/base')  # Saves the model and its weights
# LoRA setup.
# Explicitly exclude from saving the modules not adapted by LoRA (e.g., the classifiers),
...
@@ -179,7 +180,7 @@ plt.show()
# trainer.save_model('./fine-tuned-bert-lora_new')
# tokenizer.save_pretrained('./fine-tuned-bert-lora_new')
# Save the model, the LoRA adapters, and the tokenizer
base_model.save_pretrained('./micro_no_cross_fine_tuned/base')  # Saves the model and its weights
# base_model.save_pretrained('./micro_no_cross_fine_tuned/base')  # Saves the model and its weights
tokenizer.save_pretrained('./micro_no_cross_fine_tuned')  # Saves the tokenizer
# model.save_pretrained("./micro_no_cross_fine_tuned")
model.save_pretrained("./micro_no_cross_fine_tuned/lora")
...
micro_no_cross.py (+2, −1): identical to the changes in .ipynb_checkpoints/micro_no_cross-checkpoint.py above.
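The commit saves three artifacts to separate locations: the base model under `./micro_no_cross_fine_tuned/base`, the tokenizer at the directory root, and the LoRA adapters under `./micro_no_cross_fine_tuned/lora`. A minimal sketch of how those artifacts could later be reloaded together, assuming the `transformers` and `peft` packages; `load_finetuned` is a hypothetical helper, and the model class is passed in because `MultiTaskBert` is defined inside the script itself:

```python
def load_finetuned(model_cls, root="./micro_no_cross_fine_tuned"):
    """Reload the tokenizer, base weights, and LoRA adapters saved by the script.

    Imports are local so the helper can be defined without transformers/peft
    installed; actually calling it requires both packages and the saved dirs.
    """
    from transformers import AutoTokenizer
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained(root)          # tokenizer at the root
    base = model_cls.from_pretrained(f"{root}/base")         # full base model weights
    model = PeftModel.from_pretrained(base, f"{root}/lora")  # LoRA adapters on top
    return tokenizer, model
```

`PeftModel.from_pretrained` wraps the base model and injects the adapter weights from the `lora` directory; since the commit's comments say non-LoRA modules such as the classifiers are excluded from the adapter save, those weights would have to come from the base checkpoint.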