ddpo-pytorch/config/__pycache__/base.cpython-310.pyc


2023-06-23 19:25:54 -07:00
import ml_collections


def get_config():
    # Source reconstructed from base.cpython-310.pyc: field names and the
    # string/float constants are decoded from the bytecode; small-integer
    # values (marked "assumed") are not readable in the printable dump and
    # follow the repo's published defaults.
    config = ml_collections.ConfigDict()

    config.seed = 42
    config.logdir = "logs"
    config.num_epochs = 100
    config.mixed_precision = "fp16"
    config.allow_tf32 = True

    config.pretrained = pretrained = ml_collections.ConfigDict()
    pretrained.model = "runwayml/stable-diffusion-v1-5"
    pretrained.revision = "main"

    config.train = train = ml_collections.ConfigDict()
    train.batch_size = 1  # assumed
    train.use_8bit_adam = False
    train.scale_lr = False
    train.learning_rate = 1e-4
    train.adam_beta1 = 0.9
    train.adam_beta2 = 0.999
    train.adam_weight_decay = 1e-2
    train.adam_epsilon = 1e-8
    train.gradient_accumulation_steps = 1  # assumed
    train.max_grad_norm = 1.0
    train.num_inner_epochs = 1  # assumed
    train.cfg = True
    train.adv_clip_max = 5  # assumed
    train.clip_range = 1e-4

    config.sample = sample = ml_collections.ConfigDict()
    sample.num_steps = 50  # assumed
    sample.eta = 1.0
    sample.guidance_scale = 5.0
    sample.num_batches_per_epoch = 2  # assumed

    config.prompt_fn = "imagenet_animals"
    config.prompt_fn_kwargs = {}
    config.reward_fn = "jpeg_compressibility"

    config.per_prompt_stat_tracking = ml_collections.ConfigDict()
    config.per_prompt_stat_tracking.buffer_size = 16  # assumed
    config.per_prompt_stat_tracking.min_count = 16  # assumed

    return config
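Downstream training code reads these hyperparameters by attribute access (e.g. `config.train.learning_rate`). A minimal sketch of that access pattern, using the stdlib `types.SimpleNamespace` as a stand-in for `ml_collections.ConfigDict` so it runs without the dependency; the values shown are illustrative:

```python
from types import SimpleNamespace

# Stand-in for ml_collections.ConfigDict: nested objects with attribute access.
train = SimpleNamespace(learning_rate=1e-4, batch_size=1, max_grad_norm=1.0)
config = SimpleNamespace(logdir="logs", mixed_precision="fp16", train=train)

# Downstream code pulls hyperparameters by attribute, e.g. when building the optimizer.
lr = config.train.learning_rate
print(lr)  # 0.0001
```

`ml_collections.ConfigDict` adds extras on top of this pattern (locking, type checking, CLI overrides), but the attribute-style nesting is the part the config file above relies on.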
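The `per_prompt_stat_tracking` block (`buffer_size`, `min_count`) controls per-prompt reward normalization: rewards seen for each prompt are kept in a bounded buffer, advantages are the standardized rewards, and batch-level statistics are used as a fallback until a prompt has accumulated at least `min_count` samples. A stdlib-only sketch of that assumed behavior (the class and method names here are illustrative, not the repo's API):

```python
from collections import deque
import statistics


class PerPromptStatTracker:
    """Sketch: normalize rewards per prompt using a bounded history buffer."""

    def __init__(self, buffer_size=16, min_count=16):
        self.buffer_size = buffer_size
        self.min_count = min_count
        self.buffers = {}  # prompt -> deque of recent rewards

    def update(self, prompts, rewards):
        advantages = []
        for prompt, reward in zip(prompts, rewards):
            buf = self.buffers.setdefault(prompt, deque(maxlen=self.buffer_size))
            buf.append(reward)
            if len(buf) < self.min_count:
                # Too few samples for this prompt: fall back to batch statistics.
                mean = statistics.fmean(rewards)
                std = statistics.pstdev(rewards) or 1.0
            else:
                mean = statistics.fmean(buf)
                std = statistics.pstdev(buf) or 1.0
            advantages.append((reward - mean) / std)
        return advantages
```

With `buffer_size=4, min_count=2`, calling `update(["a", "a"], [1.0, 3.0])` standardizes the first reward against the batch and the second against the prompt's own buffer, yielding `[-1.0, 1.0]`.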