import torch
from torch import nn


class LitEma(nn.Module):
    def __init__(self, model, decay=0.9999, use_num_upates=True):
        super().__init__()
        if decay < 0.0 or decay > 1.0:
            raise ValueError('Decay must be between 0 and 1')

        self.m_name2s_name = {}
        self.register_buffer('decay', torch.tensor(decay, dtype=torch.float32))
        self.register_buffer('num_updates', torch.tensor(0, dtype=torch.int) if use_num_upates
                             else torch.tensor(-1, dtype=torch.int))

        for name, p in model.named_parameters():
            if p.requires_grad:
                # remove as '.'-character is not allowed in buffers
                s_name = name.replace('.', '')
                self.m_name2s_name.update({name: s_name})
                self.register_buffer(s_name, p.clone().detach().data)

        self.collected_params = []

    def reset_num_updates(self):
        del self.num_updates
        self.register_buffer('num_updates', torch.tensor(0, dtype=torch.int))

    def forward(self, model):
        decay = self.decay

        if self.num_updates >= 0:
            self.num_updates += 1
            decay = min(self.decay, (1 + self.num_updates) / (10 + self.num_updates))

        one_minus_decay = 1.0 - decay

        with torch.no_grad():
            m_param = dict(model.named_parameters())
            shadow_params = dict(self.named_buffers())

            for key in m_param:
                if m_param[key].requires_grad:
                    sname = self.m_name2s_name[key]
                    shadow_params[sname] = shadow_params[sname].type_as(m_param[key])
                    shadow_params[sname].sub_(one_minus_decay * (shadow_params[sname] - m_param[key]))
                else:
                    assert not key in self.m_name2s_name

    def copy_to(self, model):
        m_param = dict(model.named_parameters())
        shadow_params = dict(self.named_buffers())
        for key in m_param:
            if m_param[key].requires_grad:
                m_param[key].data.copy_(shadow_params[self.m_name2s_name[key]].data)
            else:
                assert not key in self.m_name2s_name

    def store(self, parameters):
        """
        Save the current parameters for restoring later.
        Args:
          parameters: Iterable of `torch.nn.Parameter`; the parameters to be
            temporarily stored.
        """
        self.collected_params = [param.clone() for param in parameters]

    def restore(self, parameters):
        """
        Restore the parameters stored with the `store` method.
        Useful to validate the model with EMA parameters without affecting the
        original optimization process. Store the parameters before the
        `copy_to` method. After validation (or model saving), use this to
        restore the former parameters.
        Args:
          parameters: Iterable of `torch.nn.Parameter`; the parameters to be
            updated with the stored parameters.
        """
        for c_param, param in zip(self.collected_params, parameters):
            param.data.copy_(c_param.data)
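

# ---------------------------------------------------------------------------
# Usage sketch (not part of the original module): how LitEma is typically
# driven from a training loop. The tiny nn.Linear model, the SGD optimizer,
# and the dummy data below are illustrative placeholders, not code from this
# repository.
# ---------------------------------------------------------------------------
if __name__ == "__main__":
    model = nn.Linear(4, 2)                    # stand-in for the real network
    ema = LitEma(model, decay=0.9999)          # shadow copies of trainable params
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

    for _ in range(100):
        x = torch.randn(8, 4)
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        ema(model)                             # update the shadow params after each step

    # Evaluate with EMA weights, then put the raw training weights back.
    ema.store(model.parameters())              # stash the current (non-EMA) weights
    ema.copy_to(model)                         # load the EMA weights into the model
    with torch.no_grad():
        _ = model(torch.randn(8, 4))           # validation / sampling would go here
    ema.restore(model.parameters())            # restore the training weights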