proekt / obuch · Commits · 7af7e8aa

Commit 7af7e8aa, authored 1 week ago by Мазур Грета Евгеньевна
Commit message: obuch with cross and graphic SAVING LORA
Parent: 20b6e963 · Branch: master
Changes: 2 changed files, with 32 additions and 2 deletions (+32 −2)

  .ipynb_checkpoints/testingkek-checkpoint.py  +16 −1
  testingkek.py                                +16 −1
.ipynb_checkpoints/testingkek-checkpoint.py (+16 −1)
```diff
@@ -58,4 +58,19 @@ with torch.no_grad():
         attention_mask=inputs['attention_mask'],
         token_type_ids=inputs.get('token_type_ids', None)  # Handle the optional argument
     )
-print("\nTest output:", {k: v.shape for k, v in outputs.items()})
\ No newline at end of file
+print("\nTest output:", {k: v.shape for k, v in outputs.items()})
+
+# Check the loaded LoRA parameters
+print("\n=== LoRA check ===")
+lora_params = [name for name, _ in model.named_parameters() if 'lora' in name]
+if lora_params:
+    print(f"✅ LoRA loaded! Found {len(lora_params)} parameters")
+    print("Example parameters:", lora_params[:4])
+else:
+    print("❌ LoRA not loaded!")
+
+# Check the effect of LoRA
+base_output = base_model(**inputs)  # without LoRA
+lora_output = model.bert(**inputs)  # with LoRA
+diff = (lora_output.last_hidden_state - base_output.last_hidden_state).abs().mean()
+print(f"\nMean change in BERT output (should be >0): {diff.item():.4f}")
\ No newline at end of file
```
testingkek.py (+16 −1)
```diff
@@ -58,4 +58,19 @@ with torch.no_grad():
         attention_mask=inputs['attention_mask'],
         token_type_ids=inputs.get('token_type_ids', None)  # Handle the optional argument
     )
-print("\nTest output:", {k: v.shape for k, v in outputs.items()})
\ No newline at end of file
+print("\nTest output:", {k: v.shape for k, v in outputs.items()})
+
+# Check the loaded LoRA parameters
+print("\n=== LoRA check ===")
+lora_params = [name for name, _ in model.named_parameters() if 'lora' in name]
+if lora_params:
+    print(f"✅ LoRA loaded! Found {len(lora_params)} parameters")
+    print("Example parameters:", lora_params[:4])
+else:
+    print("❌ LoRA not loaded!")
+
+# Check the effect of LoRA
+base_output = base_model(**inputs)  # without LoRA
+lora_output = model.bert(**inputs)  # with LoRA
+diff = (lora_output.last_hidden_state - base_output.last_hidden_state).abs().mean()
+print(f"\nMean change in BERT output (should be >0): {diff.item():.4f}")
\ No newline at end of file
```
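The commit's LoRA check rests on two conventions: adapter parameter names contain the substring `'lora'`, and a loaded adapter must measurably change the backbone's output. A minimal, self-contained sketch of the same two checks, using a toy `nn.Linear` backbone and a hand-rolled low-rank wrapper (the class name `LoRALinear`, the rank, and the dimensions are illustrative assumptions, not the repository's actual model):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)  # make the sketch deterministic

class LoRALinear(nn.Module):
    """Linear layer with a low-rank update: y = Wx + scaling * B(Ax)."""
    def __init__(self, base: nn.Linear, r: int = 4, alpha: int = 8):
        super().__init__()
        self.base = base
        # Parameter names deliberately contain 'lora', as in PEFT-style adapters
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.randn(base.out_features, r) * 0.01)
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + (x @ self.lora_A.T) @ self.lora_B.T * self.scaling

base_model = nn.Linear(16, 16)
model = LoRALinear(base_model)

# Check 1 (same idea as the commit): adapter parameters are named '*lora*'
lora_params = [name for name, _ in model.named_parameters() if 'lora' in name]
print("LoRA parameters:", lora_params)

# Check 2 (same idea as the commit): the adapter changes the output
inputs = torch.randn(2, 16)
with torch.no_grad():
    diff = (model(inputs) - base_model(inputs)).abs().mean()
print(f"Mean output change (should be >0): {diff.item():.4f}")
```

The second check is the stronger one: parameter names can be present while the adapter weights are zero-initialized or not attached to the forward pass, in which case only the output-difference test catches the problem.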