Bit-complement (~) performs a one's-complement negation, plain and simple: it flips each bit to its opposite. It is distinct from the arithmetic negation operator (-), although because JavaScript applies its bitwise operators to 32-bit two's-complement integers, the two are related by the identity ~x === -x - 1.
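A small sketch of the difference, using an arbitrary value for illustration:

```javascript
// Bitwise NOT flips every bit of the value's 32-bit representation.
const x = 5;          // ...00000000 00000101
const flipped = ~x;   // ...11111111 11111010

// In two's-complement storage that bit pattern reads back as -6,
// i.e. ~x === -x - 1:
console.log(flipped); // -6

// Arithmetic negation, by contrast, gives -5:
console.log(-x);      // -5
```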
The tilde operator is used, for example, when you are representing a small set of items using an integer (such as user-chosen options that can be independently enabled). Each bit in the integer represents whether a corresponding item is present in the set or not (or whether an option is set or not). See https://developer.mozilla.o... .
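One common use of ~ in such bit sets is clearing a single option: AND-ing with the complement of a flag turns off exactly that bit. A minimal sketch, with hypothetical flag names chosen for illustration:

```javascript
// Hypothetical user options, one bit per option.
const OPT_BOLD      = 1 << 0; // binary 001
const OPT_ITALIC    = 1 << 1; // binary 010
const OPT_UNDERLINE = 1 << 2; // binary 100

let options = OPT_BOLD | OPT_UNDERLINE; // enable two options

// Test membership with &:
console.log((options & OPT_BOLD) !== 0);      // true

// ~OPT_BOLD is a mask with every bit set EXCEPT the bold bit,
// so & ~flag clears that one bit and leaves the rest untouched:
options &= ~OPT_BOLD;

console.log((options & OPT_BOLD) !== 0);      // false
console.log((options & OPT_UNDERLINE) !== 0); // true, unaffected
```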
Using binary representation in JavaScript was long complicated by the fact that, before ES2015 introduced the 0b prefix, JavaScript had no literal notation for binary integers, unlike many other languages. Historically, binary was therefore sometimes represented by Strings limited to '0' and '1' characters, which is awkward and inefficient.
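The standard workarounds, and the modern literal form, look like this:

```javascript
// Without a binary literal, parseInt with radix 2 converts a
// '0'/'1' string into a number, and toString(2) goes the other way:
const n = parseInt('1101', 2);  // 13
const s = (13).toString(2);     // '1101'

// ES2015 added a true binary literal notation:
const m = 0b1101;               // 13

console.log(n === m);           // true
console.log(s);                 // '1101'
```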
In modern cryptography, long binary numbers are usually represented in languages such as JavaScript and PHP by character arrays (Strings) storing the binary number as a packed sequence of 8-bit integer pieces (in big-endian or little-endian order). Such strings are usually not printable as characters because the individual bytes are not encodings of printable characters.
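A sketch of that packing scheme, assuming big-endian byte order and using arbitrary example bytes: each character of the string carries one 8-bit value, written with String.fromCharCode and read back with charCodeAt.

```javascript
// Pack four 8-bit values into a string, one byte per character,
// in big-endian order for the 32-bit word they represent.
const bytes = [0xDE, 0xAD, 0xBE, 0xEF];
const packed = String.fromCharCode(...bytes);

// Recover the individual bytes with charCodeAt:
const recovered = [];
for (let i = 0; i < packed.length; i++) {
  recovered.push(packed.charCodeAt(i));
}

console.log(recovered); // [222, 173, 190, 239] — mostly unprintable as text
```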
In cryptographic code, tricks are sometimes used to build arrays of 16-bit integers instead of 8-bit ones, using Unicode string functions. It is thus possible to generate and manipulate values whose datatypes do not actually exist in JavaScript.
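The trick rests on the fact that JavaScript strings are sequences of 16-bit code units, so each character can store any value from 0 to 0xFFFF. A minimal sketch with arbitrary example values:

```javascript
// Each character of a string is a 16-bit code unit, so a string can
// serve as an array of unsigned 16-bit integers.
const words = [0x0000, 0x1234, 0xFFFF];
const s = String.fromCharCode(...words);

const out = [];
for (let i = 0; i < s.length; i++) {
  out.push(s.charCodeAt(i)); // each value reads back as a full 16-bit integer
}

console.log(out); // [0, 4660, 65535]
```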