Minimum Time Visiting All Points

Updated on 18 June, 2025

Problem Statement

On a standard two-dimensional plane, we are given n points with integer coordinates, where points[i] = [xi, yi]. The task is to determine the minimum time required to visit all points in the order given. Movement on this plane adheres to specific rules:

  • In one second, you may move:

    • One unit vertically, or
    • One unit horizontally, or
    • One unit diagonally (one unit both vertically and horizontally)

Diagonal movement advances one unit along both axes in a single second, so it is never worse than a purely horizontal or vertical step.

Examples

Example 1

Input:

points = [[1,1],[3,4],[-1,0]]

Output:

7

Explanation:

One optimal path is:
[1,1] -> [3,4] takes 3 seconds (dx=2, dy=3, max=3)  
[3,4] -> [-1,0] takes 4 seconds (dx=4, dy=4, max=4)  
Total = 3 + 4 = 7 seconds

Example 2

Input:

points = [[3,2],[-2,2]]

Output:

5

Explanation:

Only horizontal movement is needed (dx=5, dy=0), so total time = 5 seconds
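Both examples can be checked in a few lines of Python with the max-of-absolute-differences rule (the helper name `step_time` is only for illustration):

```python
def step_time(p, q):
    # Time between two points: the larger of the horizontal and vertical gaps
    return max(abs(q[0] - p[0]), abs(q[1] - p[1]))

# Example 1: [1,1] -> [3,4] -> [-1,0]
print(step_time([1, 1], [3, 4]) + step_time([3, 4], [-1, 0]))  # -> 7

# Example 2: [3,2] -> [-2,2]
print(step_time([3, 2], [-2, 2]))  # -> 5
```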

Constraints

  • points.length == n
  • 1 <= n <= 100
  • points[i].length == 2
  • -1000 <= points[i][0], points[i][1] <= 1000

Approach and Intuition

To solve this problem, we need to calculate the time taken to move between each pair of consecutive points using the movement rules:

  1. Key Insight: The minimal time to move from point A = (x1, y1) to point B = (x2, y2) is the Chebyshev distance:

    max(|x2 - x1|, |y2 - y1|)

    because each diagonal step reduces both the x-distance and the y-distance by one. Once the shorter distance is exhausted, only one coordinate still differs, and the remaining gap must be covered one unit per second.

  2. Steps:

    • Initialize total_time = 0
    • Iterate over pairs of consecutive points.
    • For each pair, calculate dx = |x2 - x1| and dy = |y2 - y1|
    • Add max(dx, dy) to total_time
    • Return the accumulated total_time
  3. Complexity:

    • Time: O(n) where n is the number of points
    • Space: O(1) since we only track total time

This logic yields the optimal traversal time under the given movement rules. Edge cases fall out naturally: with a single point there are no consecutive pairs, so the answer is 0.
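The steps above can be sketched in a few lines of Python (the function name `min_time_to_visit` is illustrative):

```python
from typing import List

def min_time_to_visit(points: List[List[int]]) -> int:
    total_time = 0
    # Sum the Chebyshev distance between each pair of consecutive points
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        dx, dy = abs(x2 - x1), abs(y2 - y1)
        total_time += max(dx, dy)
    return total_time

print(min_time_to_visit([[1, 1], [3, 4], [-1, 0]]))  # -> 7
```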

Solutions

  • C++
  • Java
  • Python
```cpp
#include <algorithm>
#include <cstdlib>
#include <vector>
using namespace std;

class Solution {
public:
    int minimumTime(vector<vector<int>>& coordinates) {
        int totalTime = 0;
        for (size_t j = 0; j + 1 < coordinates.size(); j++) {
            int startX = coordinates[j][0];
            int startY = coordinates[j][1];
            int endX = coordinates[j + 1][0];
            int endY = coordinates[j + 1][1];
            // Chebyshev distance between consecutive points
            totalTime += max(abs(endX - startX), abs(endY - startY));
        }
        return totalTime;
    }
};
```

The solution is written in C++ and finds the minimum time needed to visit all points on a 2D plane. The function minimumTime takes a vector<vector<int>> as input, where each inner vector holds the coordinates of one point.

Here's how the solution works:

  1. Initialize a variable totalTime to zero to keep track of the total minimum time required.
  2. Loop through each point in the coordinates up to the second last point:
    • Extract the x and y coordinates of the current point and the next point.
    • Calculate the time to move from the current point to the next point: the maximum of the absolute differences between the two points' x-coordinates and y-coordinates. This is the Chebyshev distance, under which a diagonal move counts as a single step.
  3. Accumulate the time calculated for each movement in totalTime.
  4. After the loop, return totalTime as the outcome.

This approach computes the minimal time because the Chebyshev distance is exactly the number of steps needed when diagonal moves are allowed: diagonal steps shrink both axis distances at once until the shorter one reaches zero, and straight steps cover whatever remains along the longer axis.
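To see that the Chebyshev formula matches an actual step-by-step walk, a small simulation (illustrative only, in Python) can greedily move diagonally while both coordinates differ, then straight along the remaining axis:

```python
def simulate_walk(p, q):
    # Count seconds by greedily stepping toward the target,
    # moving diagonally whenever both coordinates still differ.
    x, y = p
    tx, ty = q
    seconds = 0
    while (x, y) != (tx, ty):
        if x != tx:
            x += 1 if tx > x else -1
        if y != ty:
            y += 1 if ty > y else -1
        seconds += 1
    return seconds

print(simulate_walk([1, 1], [3, 4]))               # -> 3
print(simulate_walk([1, 1], [3, 4]) == max(2, 3))  # -> True
```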

```java
class Solution {
    public int minimumTimeToVisit(int[][] coordinates) {
        int totalTime = 0;
        for (int i = 0; i < coordinates.length - 1; i++) {
            int startX = coordinates[i][0];
            int startY = coordinates[i][1];
            int endX = coordinates[i + 1][0];
            int endY = coordinates[i + 1][1];
            // Chebyshev distance between consecutive points
            totalTime += Math.max(Math.abs(endX - startX), Math.abs(endY - startY));
        }
        return totalTime;
    }
}
```

The given Java program is designed to find the minimum time required to visit all points in a plane with Cartesian coordinates. It effectively calculates the time taken to move in a grid pattern from one point to another point sequentially in the least amount of time.

Here's a breakdown of how the program works:

  • It defines a class Solution with a method minimumTimeToVisit that accepts a 2D array coordinates representing points on a plane.
  • The method initializes totalTime to zero, which will store the accumulated minimum time required to travel between consecutive points.
  • Within a loop, it calculates the distance between consecutive points found in coordinates. For each pair of consecutive points defined by (startX, startY) and (endX, endY), it computes the time to move from one point to another.
  • The time calculation between two points relies on identifying the greater of the horizontal or vertical distance using the maximum difference (Math.max) between x-coordinates and y-coordinates. This approach ensures moving in a direct line either horizontally, vertically, or diagonally, which is akin to a Chebyshev distance calculation.

The method finally returns the total computed time, totalTime, which represents the minimum time required to traverse all given coordinates in order.

Use the minimumTimeToVisit method by providing an array of coordinates and it will efficiently compute and return the minimum time required to cover all points. This function proves especially useful in scenarios involving grid-based routing or pathfinding, such as robot movement simulations or game character path calculations.

```python
from typing import List

class Solution:
    def computeMinimumTime(self, coordinates: List[List[int]]) -> int:
        total_time = 0
        for index in range(len(coordinates) - 1):
            x_initial, y_initial = coordinates[index]
            x_next, y_next = coordinates[index + 1]
            # Chebyshev distance between consecutive points
            total_time += max(abs(x_next - x_initial), abs(y_next - y_initial))
        return total_time
```

This tutorial explains how to compute the minimum time required to move between a series of points on a grid where movement from one point to another costs time equal to the maximum of the absolute differences in their x-coordinates or y-coordinates.

The solution provided is written in Python and involves a straightforward approach. Use the following steps to understand the code:

  1. Define a method computeMinimumTime that accepts a list of coordinate pairs.

  2. Initialize a variable total_time to zero. This will accumulate the total time spent moving from point to point.

  3. Iterate through each point in the list except the last one:

    • Extract the current point coordinates (x_initial, y_initial) and the next point coordinates (x_next, y_next).
    • Calculate the time to move to the next point using the maximum of the absolute difference between x-coordinates and y-coordinates of these two points.
    • Add this time to total_time.
  4. After iterating through all points, return the total_time.

In conclusion, this method effectively calculates the required minimum time using the Chebyshev distance between consecutive points and accumulates this time to get the total time for visiting all specified points.
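For completeness, here is a usage sketch; the class is repeated from above so the snippet runs standalone:

```python
from typing import List

class Solution:
    def computeMinimumTime(self, coordinates: List[List[int]]) -> int:
        total_time = 0
        for index in range(len(coordinates) - 1):
            x_initial, y_initial = coordinates[index]
            x_next, y_next = coordinates[index + 1]
            total_time += max(abs(x_next - x_initial), abs(y_next - y_initial))
        return total_time

solution = Solution()
print(solution.computeMinimumTime([[1, 1], [3, 4], [-1, 0]]))  # -> 7
```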
