Traceback (most recent call last):
  File "/usr/bin/nvidia-detector", line 8, in <module>
    a = NvidiaDetection(printonly=True, verbose=False)
  File "/usr/lib/python2.7/dist-packages/NvidiaDetector/nvidiadetector.py", line 68, in __init__
    self.getData()
  File "/usr/lib/python2.7/dist-packages/NvidiaDetector/nvidiadetector.py", line 145, in getData
    driver_version = self.__get_value_from_name(package.name.split('-', 1)[1])
  File "/usr/lib/python2.7/dist-packages/NvidiaDetector/nvidiadetector.py", line 87, in __get_value_from_name
    v = int(name)
ValueError: invalid literal for int() with base 10: '173:i386'
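The traceback shows why the crash happens: the package name carries a Debian multiarch suffix (e.g. `nvidia-173:i386`), so `package.name.split('-', 1)[1]` yields `'173:i386'`, which `int()` cannot parse. A minimal sketch of a more tolerant parse (the function name `get_driver_version` is hypothetical, not the library's actual API) would strip the `:arch` qualifier before converting:

```python
def get_driver_version(package_name):
    """Extract the numeric driver version from a package name.

    Hypothetical helper: drops a Debian multiarch suffix such as
    ':i386' before the int() conversion that nvidiadetector.py
    attempts directly on the raw split result.
    """
    # 'nvidia-173:i386' -> '173:i386'
    version_part = package_name.split('-', 1)[1]
    # Drop any ':<arch>' multiarch qualifier -> '173'
    version_part = version_part.split(':', 1)[0]
    return int(version_part)

print(get_driver_version('nvidia-173:i386'))  # -> 173
print(get_driver_version('nvidia-304'))       # -> 304
```

With this guard, the `:i386` package name that triggers the `ValueError` above parses cleanly, while plain names like `nvidia-304` are unaffected.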