Problems with the Sign-Magnitude Method

Although the sign-magnitude method uses a simple representation idea (one bit for the sign and the remaining bits for the magnitude), working with sign-magnitude numbers creates a few noticeable issues that we just can't ignore:

- Zero has two representations, $+0$ and $-0$, which makes comparisons against zero ambiguous.
- Addition and subtraction depend on the operands' signs: the hardware must first compare the signs to decide whether to add or subtract the magnitudes, instead of performing the operation in a straightforward way.
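These problems are easy to demonstrate in code. The sketch below uses a hypothetical 8-bit sign-magnitude encoding (the helper names `sm_encode` and `sm_decode` are illustrative, not part of any standard library): the top bit holds the sign and the low 7 bits hold the magnitude.

```python
def sm_encode(value: int) -> int:
    """Encode an integer in [-127, 127] as an 8-bit sign-magnitude pattern."""
    sign = 0x80 if value < 0 else 0x00
    return sign | abs(value)

def sm_decode(bits: int) -> int:
    """Decode an 8-bit sign-magnitude pattern back to an integer."""
    magnitude = bits & 0x7F
    return -magnitude if bits & 0x80 else magnitude

# Issue 1: zero has two encodings.
pos_zero = sm_encode(0)   # 0b0000_0000, "plus zero"
neg_zero = 0b1000_0000    # "minus zero" bit pattern
assert pos_zero != neg_zero                              # different patterns...
assert sm_decode(pos_zero) == sm_decode(neg_zero) == 0   # ...same value

# Issue 2: adding the raw bit patterns does not add the values.
a, b = sm_encode(5), sm_encode(-3)
naive = (a + b) & 0xFF    # plain binary addition of the patterns
assert sm_decode(naive) == -8   # not 5 + (-3) = 2
```

The last assertion shows why dedicated sign-checking logic is needed: blindly feeding sign-magnitude patterns through a binary adder produces $-8$ instead of $2$.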

To spare the computer this extra decision-making about whether to add or subtract, letting it perform the operations in a straightforward way, and to eliminate the ambiguity in the representation of $0$, computers use complement systems to represent integers.
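As a contrast with the sign-magnitude sketch, here is a minimal illustration of a complement system, assuming 8-bit two's complement (the helper names `tc_encode` and `tc_decode` are again illustrative): zero has a single encoding, and ordinary binary addition handles every sign combination without inspecting the operands.

```python
def tc_encode(value: int) -> int:
    """Encode an integer in [-128, 127] as an 8-bit two's-complement pattern."""
    return value & 0xFF

def tc_decode(bits: int) -> int:
    """Decode an 8-bit two's-complement pattern back to an integer."""
    return bits - 0x100 if bits & 0x80 else bits

# Only one encoding of zero: negating 0 yields the same bit pattern.
assert tc_encode(0) == tc_encode(-0) == 0b0000_0000

# Addition needs no sign inspection: the same adder handles every case.
a, b = tc_encode(5), tc_encode(-3)
assert tc_decode((a + b) & 0xFF) == 2   # 5 + (-3) = 2, computed directly
```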