Problem Statement
Given an array of points where points[i] = [xi, yi] represents a point on the X-Y plane and an integer k, return the k closest points to the origin (0, 0). The distance between two points on the X-Y plane is the Euclidean distance sqrt((x1 - x2)^2 + (y1 - y2)^2). You may return the answer in any order. The answer is guaranteed to be unique (except for the order that it is in).
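Because sqrt is monotonically increasing, comparing squared distances gives the same ordering as comparing true Euclidean distances, so the sqrt can be skipped entirely. A minimal sketch (the helper name `squared_dist` is chosen here for illustration):

```python
def squared_dist(p):
    # Squared Euclidean distance from the origin (0, 0).
    # Skipping the sqrt does not change the ordering of points.
    return p[0] ** 2 + p[1] ** 2

# The points from the first example below: (1, 3) vs (-2, 2)
print(squared_dist([1, 3]))    # 1^2 + 3^2 = 10
print(squared_dist([-2, 2]))   # (-2)^2 + 2^2 = 8
```

Since 8 < 10, the point (-2, 2) is closer to the origin, with no sqrt needed.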
Input: points = [[1,3],[-2,2]], k = 1
Output: [[-2,2]]
Explanation: The distance between (1, 3) and the origin is sqrt(10). The distance between (-2, 2) and the origin is sqrt(8). Since sqrt(8) < sqrt(10), [[-2,2]] is returned.
Input: points = [[3,3],[5,-1],[-2,4]], k = 2
Output: [[3,3],[-2,4]]
Explanation:
[3,3]: distance = sqrt(18) ≈ 4.24
[5,-1]: distance = sqrt(26) ≈ 5.10
[-2,4]: distance = sqrt(20) ≈ 4.47

Input: points = [[1,1],[2,2],[3,3],[4,4]], k = 3
Output: [[1,1],[2,2],[3,3]]
Explanation: The distances are 1.41, 2.83, 4.24, 5.66. The 3 closest are the first three.
Constraints:
1 <= k <= points.length <= 10^4
-10^4 < xi, yi < 10^4
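One common way to solve this within the stated constraints is a max-heap of size k: scan the points once, keeping only the k closest seen so far, for O(n log k) time. A sketch of that approach (the function name `k_closest` is chosen here; Python's heapq is a min-heap, so distances are negated to simulate a max-heap):

```python
import heapq

def k_closest(points, k):
    # Max-heap of the k smallest squared distances, stored as
    # (-squared_distance, x, y) because heapq is a min-heap.
    heap = []
    for x, y in points:
        heapq.heappush(heap, (-(x * x + y * y), x, y))
        if len(heap) > k:
            # Evict the farthest of the k+1 candidates.
            heapq.heappop(heap)
    return [[x, y] for _, x, y in heap]

print(k_closest([[1, 3], [-2, 2]], 1))  # [[-2, 2]]
```

Sorting all points by squared distance and taking the first k also works (O(n log n)); the heap variant only pays off when k is much smaller than the number of points.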