x86 architecture
K1OM transform modifiers

MVEX.sss
and disp8*N   000b           001b           010b           011b           100b           101b           110b           111b

S*(reg) E=0   {dcba}         {cdab}         {badc}         {dacb}         {aaaa}         {bbbb}         {cccc}         {dddd}
S*(reg) E=1   {rn}           {rd}           {ru}           {rz}           {rn-sae}       {rd-sae}       {ru-sae}       {rz-sae}
S*r(reg) E=0  {dcba}         {cdab}         {badc}         {dacb}         {aaaa}         {bbbb}         {cccc}         {dddd}
S*r(reg) E=1  reserved       reserved       reserved       reserved       reserved       reserved       reserved       reserved
S*s(reg) E=0  {dcba}         {cdab}         {badc}         {dacb}         {aaaa}         {bbbb}         {cccc}         {dddd}
S*s(reg) E=1  reserved       reserved       reserved       reserved       {sae0}         {sae1}         {sae2}         {sae3}
Si32(mem)     {16to16} [/1]  {1to16} [/16]  {4to16} [/4]   reserved       {uint8} [/4]   {sint8} [/4]   {uint16} [/2]  {sint16} [/2]
Sf32(mem)     {16to16} [/1]  {1to16} [/16]  {4to16} [/4]   {float16} [/2] {uint8} [/4]   reserved       {uint16} [/2]  {sint16} [/2]
Si64(mem)     {8to8} [/1]    {1to8} [/8]    {4to8} [/2]    reserved       reserved       reserved       reserved       reserved
Sf64(mem)     {8to8} [/1]    {1to8} [/8]    {4to8} [/2]    reserved       reserved       reserved       reserved       reserved
Ui32(mem)     {16to16} [/1]  reserved       reserved       reserved       {uint8} [/4]   {sint8} [/4]   {uint16} [/2]  {sint16} [/2]
Uf32(mem)     {16to16} [/1]  reserved       reserved       {float16} [/2] {uint8} [/4]   {sint8} [/4]   {uint16} [/2]  {sint16} [/2]
Ui64(mem)     {8to8} [/1]    reserved       reserved       reserved       reserved       reserved       reserved       reserved
Uf64(mem)     {8to8} [/1]    reserved       reserved       reserved       reserved       reserved       reserved       reserved
Di32(reg)     {none} [/1]    reserved       reserved       reserved       {uint8} [/4]   {sint8} [/4]   {uint16} [/2]  {sint16} [/2]
Df32(reg)     {none} [/1]    reserved       reserved       {float16} [/2] {uint8} [/4]   {sint8} [/4]   {uint16} [/2]  {sint16} [/2]
Di64(reg)     {none} [/1]    reserved       reserved       reserved       reserved       reserved       reserved       reserved
Df64(reg)     {none} [/1]    reserved       reserved       reserved       reserved       reserved       reserved       reserved
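The [/x] notes in the memory rows describe the disp8*N compressed-displacement scaling: the 8-bit displacement is multiplied by N, the number of bytes the memory operand actually occupies, which for a full 512-bit operand works out to 64/x. The sketch below tabulates this reading for the Si32(mem) row; the function name and the zero return for the reserved encoding are illustrative assumptions, not part of the tables.

#include <stdint.h>

/* disp8*N scale for the Si32(mem) row above, read as N = 64/x.
   Returns 0 for the reserved encoding (011b).  Name is hypothetical. */
int si32_mem_disp8_scale(uint8_t sss)
{
    switch (sss & 7) {
    case 0: return 64;  /* 000b {16to16}: 16 dwords = 64 bytes      [/1]  */
    case 1: return  4;  /* 001b {1to16}:   1 dword  =  4 bytes      [/16] */
    case 2: return 16;  /* 010b {4to16}:   4 dwords = 16 bytes      [/4]  */
    case 4: return 16;  /* 100b {uint8}:  16 bytes before widening  [/4]  */
    case 5: return 16;  /* 101b {sint8}:  16 bytes before widening  [/4]  */
    case 6: return 32;  /* 110b {uint16}: 32 bytes before widening  [/2]  */
    case 7: return 32;  /* 111b {sint16}: 32 bytes before widening  [/2]  */
    default: return 0;  /* 011b: reserved for Si32(mem)                   */
    }
}

With sss = 001b ({1to16}), for example, N = 4, so a disp8 of 3 addresses base + 12.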

 
special cases

MVEX.sss
and disp8*N   000b           001b           010b           011b           100b           101b           110b           111b

Si64n(r/m): VALIGND, VPERMD, VPERMF32X4, and VPSHUFD
reg E=0       {dcba}         reserved       reserved       reserved       reserved       reserved       reserved       reserved
reg E=1       reserved       reserved       reserved       reserved       reserved       reserved       reserved       reserved
mem           {none} [/1]    reserved       reserved       reserved       reserved       reserved       reserved       reserved

Sf64n(r/m): VEXP223PS, VLOG2PS, VRCP23PS, and VRSQRT23PS
reg E=0       {dcba}         reserved       reserved       reserved       reserved       reserved       reserved       reserved
reg E=1       reserved       reserved       reserved       reserved       {sae0}         {sae1}         {sae2}         {sae3}
mem           {none} [/1]    reserved       reserved       reserved       reserved       reserved       reserved       reserved

Si32b(r/m): VPMADD233D
reg E=0       {dcba}         reserved       reserved       reserved       reserved       reserved       reserved       reserved
reg E=1       reserved       reserved       reserved       reserved       reserved       reserved       reserved       reserved
mem           {16to16} [/1]  reserved       {4to16} [/4]   reserved       reserved       reserved       reserved       reserved

Sf32b(r/m): VFMADD233PS
reg E=0       {dcba}         reserved       reserved       reserved       reserved       reserved       reserved       reserved
reg E=1       {rn}           {rd}           {ru}           {rz}           {rn-sae}       {rd-sae}       {ru-sae}       {rz-sae}
mem           {16to16} [/1]  reserved       {4to16} [/4]   reserved       reserved       reserved       reserved       reserved

Si32c(r/m): VCVTDQ2PD and VCVTUDQ2PD
reg E=0       {dcba}         {cdab}         {badc}         {dacb}         {aaaa}         {bbbb}         {cccc}         {dddd}
reg E=1       reserved       reserved       reserved       reserved       reserved       reserved       reserved       reserved
mem           {8to8} [/1]    {1to8} [/8]    {4to8} [/2]    reserved       reserved       reserved       reserved       reserved

Sf32c(r/m): VCVTPS2PD
reg E=0       {dcba}         {cdab}         {badc}         {dacb}         {aaaa}         {bbbb}         {cccc}         {dddd}
reg E=1       reserved       reserved       reserved       reserved       {sae0}         {sae1}         {sae2}         {sae3}
mem           {8to8} [/1]    {1to8} [/8]    {4to8} [/2]    reserved       reserved       reserved       reserved       reserved
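The register rows of both tables follow one pattern for the full-rounding forms (the S*(reg) and Sf32b register rows): with E=0 the sss field picks one of the eight swizzles, and with E=1 the low two bits select the rounding mode while bit 2 adds SAE. A minimal decoding sketch, with assumed (unofficial) names:

#include <stdint.h>
#include <stdio.h>

static const char *const swizzle[8] = {
    "dcba", "cdab", "badc", "dacb", "aaaa", "bbbb", "cccc", "dddd"
};
static const char *const rounding[4] = { "rn", "rd", "ru", "rz" };

/* Print the modifier selected by E and sss for a full-rounding register
   form (hypothetical helper, mirroring the S*(reg) rows above). */
void print_sreg_modifier(unsigned e, uint8_t sss)
{
    sss &= 7;
    if (e == 0)
        printf("{%s}\n", swizzle[sss]);        /* E=0: source swizzle       */
    else
        printf("{%s%s}\n", rounding[sss & 3],  /* E=1: rounding in bits 1:0 */
               (sss & 4) ? "-sae" : "");       /*      bit 2 adds SAE       */
}

int main(void)
{
    print_sreg_modifier(0, 3);  /* prints {dacb}   */
    print_sreg_modifier(1, 5);  /* prints {rd-sae} */
    return 0;
}

print_sreg_modifier(1, 5) prints {rd-sae}, matching the 101b column of the S*(reg) E=1 row. The SAE-only forms (S*s(reg), Sf64n, Sf32c) instead treat 0xxb as reserved and 1xxb as {sae0}..{sae3}.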



© 1996-2024 by Christian Ludloff. All rights reserved. Use at your own risk.