Here is the full interview question "CPU Usage Analysis - Uber Coding Interview | ShowOffer" from Uber on ShowOffer.io:
CPU Usage Analysis (Uber, Easy)
You are given two arrays cpu and gpu of equal length, where cpu[i] and gpu[i] represent the CPU and GPU usage (a fraction between 0 and 1) at time i. You need to determine whether there exists an index k such that the combined usage at time k is exactly 1. In other words, find whether there exists an index k such that cpu[k] + gpu[k] = 1.
Constraints:
1 <= cpu.length == gpu.length <= 10^5
0 <= cpu[i], gpu[i] <= 1

Examples:
cpu = [0.1, 0.3, 0.2], gpu = [0.3, 0.1, 0.8]
Output: True, since cpu[2] + gpu[2] = 1

cpu = [0.5, 0.2, 0.1], gpu = [0.1, 0.3, 0.4]
Output: False

Hints:
… k efficiently.

Solution:
To solve this problem, iterate through cpu and gpu in lockstep and check whether any index k satisfies cpu[k] + gpu[k] = 1. With at most 10^5 elements, a single linear pass is easily fast enough. Because the usage values are floating-point fractions, the sum should be compared to 1 with a small tolerance rather than with exact equality.
Here's a Python code snippet to solve the problem:
```python
import math

def find_cpu_gpu_usage(cpu, gpu):
    # Scan each time index and test whether the combined usage hits 1.
    # Use math.isclose rather than == because the inputs are floats
    # (e.g. 0.3 + 0.7 != 1.0 under exact comparison).
    for i in range(len(cpu)):
        if math.isclose(cpu[i] + gpu[i], 1.0):
            return True
    return False
```
In this code, we iterate through the arrays cpu and gpu using a for loop. If we find an index i such that cpu[i] + gpu[i] = 1, we return True. If we finish iterating through the arrays without finding such an index, we return False.
This solution has a time complexity of O(n), where n is the length of the input arrays, which satisfies the linear time complexity constraint.