SQL query reporting what I think is the wrong minimum value when using the MIN aggregate with a WHERE filter

I am taking an SQL course on Coursera, and this question came up on a practice quiz:

"Excluding $0.00, what is the minimum bi-weekly high rate of pay (please include the dollar sign and decimal point in your answer)?"

Now I got the right answer (which returned $100.00) using

SELECT 
    min(Biweekly_High_Rate)
FROM
    salary_range_by_job_classification
WHERE Biweekly_High_Rate <> '$0.00'

However, when I query the same table using SELECT *, I can see that, at least in the rows retrieved, there is a value of $15.00, which is less than $100.00. Why is it not returning $15.00? [table snippet of SELECT * FROM salary_range_by_job_classification]



Solution 1:[1]

I would recommend removing the dollar sign and converting the resulting string to a decimal amount. Use a decimal amount as your limiting parameter (0.00) as well, so that you are comparing two numbers rather than two strings. This should produce the expected result:

SELECT MIN(CONVERT(DECIMAL(10,2),REPLACE(BiWeekly_High_Rate,'$','')))
FROM salary_range_by_job_classification
WHERE CONVERT(DECIMAL(10,2),REPLACE(BiWeekly_High_Rate,'$','')) <> 0.00

See how that goes. When you compare two strings, MIN returns the first value in lexicographic (alphabetical) order, not numeric order: strings are compared character by character, and '$100.00' sorts before '$15.00' because '0' comes before '5' at the third character.
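The effect is easy to reproduce. The sketch below (using Python's sqlite3 with a few hypothetical rows, since the quiz's actual dataset and engine aren't shown here) contrasts the string MIN with a numeric MIN; note that SQLite uses CAST rather than the CONVERT syntax shown above, which is SQL Server's:

```python
import sqlite3

# In-memory table with hypothetical pay-rate strings.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE salary_range_by_job_classification (Biweekly_High_Rate TEXT)"
)
conn.executemany(
    "INSERT INTO salary_range_by_job_classification VALUES (?)",
    [("$0.00",), ("$15.00",), ("$100.00",)],
)

# String MIN: lexicographic comparison, so '$100.00' < '$15.00'.
string_min = conn.execute(
    "SELECT MIN(Biweekly_High_Rate) "
    "FROM salary_range_by_job_classification "
    "WHERE Biweekly_High_Rate <> '$0.00'"
).fetchone()[0]

# Numeric MIN: strip the '$' and cast, then compare as numbers.
numeric_min = conn.execute(
    "SELECT MIN(CAST(REPLACE(Biweekly_High_Rate, '$', '') AS REAL)) "
    "FROM salary_range_by_job_classification "
    "WHERE CAST(REPLACE(Biweekly_High_Rate, '$', '') AS REAL) <> 0"
).fetchone()[0]

print(string_min)   # '$100.00'
print(numeric_min)  # 15.0
```

The same comparison rule explains the quiz result: as text, '$100.00' really is the "smallest" non-zero value, even though 15 < 100 numerically.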

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 High Plains Grifter