An integral conversion never produces undefined behaviour (it can produce implementation-defined behaviour).
A conversion to a type that can represent the value being converted is always well-defined: the value simply stays unchanged.
A conversion to an unsigned type is always well-defined: the value is reduced modulo one more than the maximum value the target type can represent (for unsigned int, that is modulo UINT_MAX+1).
A conversion to a signed type that cannot represent the value being converted results in either an implementation-defined value or an implementation-defined signal being raised.
Note that the above rules are defined in terms of integer values and not in terms of sequences of bits.