How does tf.keras.metrics.TopKCategoricalAccuracy differ from Precision@k?

Coming from recommender systems, precision@k is a popular metric.

precision@k = (number of relevant predictions in the top k) / k

The TensorFlow docs for tf.keras.metrics.TopKCategoricalAccuracy state:

Computes how often targets are in the top K predictions.

https://www.tensorflow.org/api_docs/python/tf/keras/metrics/TopKCategoricalAccuracy

That seems to be exactly the same as precision@k. Am I missing something, or are they equivalent and it just comes down to TF vs. recommender-systems terminology?



Solution 1:[1]

TopKCategoricalAccuracy and precision@k are two different metrics. Let us look at an example.

For instance, in a recommendation use case, we predict 5 movies ["A", "B", "C", "D", "F"] for a user; the user viewed movie 'A' and rejected the rest.

  1. Precision@1 = 1/1 = 1
  2. Precision@5 = 1/5 (among the 5 recommended movies, the user selected only one)
  3. Top1CategoricalAccuracy (k=1) = 1, i.e. 100% (the first movie in the prediction list, 'A', was the one the user watched)
  4. Top5CategoricalAccuracy (k=5) = 1, i.e. 100% (the right answer appears somewhere in the top five guesses; see the sketch below)
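To make the difference concrete, here is a minimal sketch of the movie example. The prediction scores are made up for illustration (they simply rank the movies A > B > C > D > F), and precision_at_k is a hypothetical helper written here, not a Keras API:

```python
import numpy as np
import tensorflow as tf

# Classes correspond to the movies ["A", "B", "C", "D", "F"].
# The user only watched movie "A" (class 0), so the one-hot label marks that class.
y_true = np.array([[1., 0., 0., 0., 0.]])

# Hypothetical model scores that rank the movies A > B > C > D > F.
y_pred = np.array([[0.50, 0.20, 0.15, 0.10, 0.05]])

# Keras metric: counts a hit whenever the true class appears anywhere in the top k.
for k in (1, 5):
    metric = tf.keras.metrics.TopKCategoricalAccuracy(k=k)
    metric.update_state(y_true, y_pred)
    print(f"Top{k}CategoricalAccuracy = {metric.result().numpy():.2f}")  # 1.00 for both

# precision@k: relevant items among the top k, divided by k (illustrative helper).
def precision_at_k(relevant, ranked, k):
    return len(set(ranked[:k]) & set(relevant)) / k

ranked_movies = ["A", "B", "C", "D", "F"]       # predictions sorted by score
print(precision_at_k({"A"}, ranked_movies, 1))  # 1.0
print(precision_at_k({"A"}, ranked_movies, 5))  # 0.2
```

In this example the two metrics only agree at k = 1; once k exceeds the number of relevant items, precision@k drops while top-k accuracy stays at 1.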

Solution 2:[2]

No, you are not missing anything. You are right: they are the same metric.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0. Source: Stack Overflow.

Solution 1: tinu mohan
Solution 2: Hamidreza Hosseinkhani