Category "sql"

What is the optimized/best way to retrieve data from two tables?

I have two tables. post table:
+---------+------------+
| post_id | post_title |
+---------+------------+
| 1       | Post 1     |
| 2       | Post 2     |
| 3       | Post 3     |
post_cre
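
A minimal sketch of the usual single-query approach, assuming the second table is something like post_comments keyed by post_id (every name beyond the excerpt is an assumption):

    SELECT p.post_id,
           p.post_title,
           c.comment_text              -- assumed column on the second table
    FROM   post AS p
    JOIN   post_comments AS c          -- assumed second table, keyed by post_id
           ON c.post_id = p.post_id;

One JOIN lets the database return both tables' data in a single round trip instead of issuing a query per post.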

SQL tuple/lexicographic comparison with multiple directions

I need to return elements from a database query based on an inequality using the lexicographic ordering on multiple columns. As described in this question this
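
Row-value comparisons such as (a, b) > (:a, :b) only cover the all-ascending case; with mixed sort directions the condition is normally expanded by hand. A sketch assuming two columns ordered a ASC, b DESC and placeholder values :a and :b:

    -- "rows that come after (:a, :b)" under ORDER BY a ASC, b DESC
    SELECT *
    FROM   t
    WHERE  a > :a
       OR (a = :a AND b < :b)   -- b is descending, so "later" means smaller
    ORDER  BY a ASC, b DESC;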

how to sum results of a calculated column

I'm trying to get the total duration of membership types (member & casual). Another application of the same problem would be to get the total count of the popu
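
One common shape for this, assuming a trips-style table where the duration is computed per row and MySQL-style TIMESTAMPDIFF (all names here are placeholders): compute the value inside the aggregate, then GROUP BY the membership type.

    SELECT member_casual,
           SUM(TIMESTAMPDIFF(MINUTE, started_at, ended_at)) AS total_minutes
    FROM   trips
    GROUP  BY member_casual;

The same pattern, an aggregate over an expression grouped by a category, also covers count-style variants of the problem.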

Overflow when looping through SQL INSERTs for an ADODB.Connection

I am trying to insert real-time data from a financial service provider into Excel, which can only be fetched with an Excel-plugin in blocks of roughly 100,000 v
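
The overflow itself is usually a VBA-side issue (an Integer loop counter tops out at 32,767), but batching rows into multi-row INSERT statements also reduces how many statements the loop has to push through the connection. A sketch of the multi-row form; the table and columns are placeholders:

    INSERT INTO quotes (ticker, price, quoted_at)
    VALUES ('ABC', 101.25, '2024-01-02 09:30:00'),
           ('DEF',  55.10, '2024-01-02 09:30:00'),
           ('GHI',  12.75, '2024-01-02 09:30:00');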

How to find the count month-wise in a particular day range in SQL?

Like date count users 2 jan - 7 feb 2 feb - 7 march
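
Assuming the goal is to count users whose date falls inside each custom period (the period boundaries, table, and columns below are all placeholders), one approach is to put the ranges in a derived table and join on BETWEEN:

    SELECT r.period_start,
           r.period_end,
           COUNT(u.user_id) AS user_count
    FROM  (SELECT DATE '2023-01-02' AS period_start, DATE '2023-02-07' AS period_end
           UNION ALL
           SELECT DATE '2023-02-02', DATE '2023-03-07') AS r
    LEFT JOIN users AS u
           ON u.created_at BETWEEN r.period_start AND r.period_end
    GROUP BY r.period_start, r.period_end;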

SUM Values by Sequence Number and Group By Flag

We have a list with a sequence number. The sequence will break, then begin again. As you can see below, the SalesOrderLine is missing the number 4. SalesOrder
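
This is the classic gaps-and-islands pattern: the difference between the sequence number and ROW_NUMBER() stays constant inside each unbroken run, so it can serve as the group key. A sketch with assumed column names:

    WITH grouped AS (
        SELECT SalesOrder,
               SalesOrderLine,
               Amount,
               SalesOrderLine
                 - ROW_NUMBER() OVER (PARTITION BY SalesOrder
                                      ORDER BY SalesOrderLine) AS grp
        FROM   SalesLines
    )
    SELECT SalesOrder, grp, SUM(Amount) AS group_total
    FROM   grouped
    GROUP  BY SalesOrder, grp;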

create a new column that contains a list of values from another column's subsequent rows

I have a table like the one below and want to create a new column that contains a list of values from another column's subsequent rows, like below (for copy-paste): time
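
In PostgreSQL any aggregate can run as a window function over a frame, so string_agg across the rows that follow gives each row the "rest of the column" as a list. A sketch with assumed names (SQL Server would need an OUTER APPLY with STRING_AGG instead):

    SELECT time_col,
           value_col,
           string_agg(value_col::text, ',') OVER (
               ORDER BY time_col
               ROWS BETWEEN 1 FOLLOWING AND UNBOUNDED FOLLOWING
           ) AS following_values
    FROM   t;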

translating SQLCipher GUI to CLI (or Python)

The question is simple: how can I turn this SQLCipher GUI into lines of code, either in the SQLCipher CLI or using a Python module? Here's what I've tried (FAILED ATTE
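
What the GUI does mostly reduces to keying the database before any other statement runs. A minimal sketch of the equivalent sqlcipher CLI session, assuming SQLCipher 4 defaults (file name and passphrase are placeholders):

    -- $ sqlcipher encrypted.db
    PRAGMA key = 'my passphrase';         -- must be the first statement touching the data
    -- PRAGMA cipher_compatibility = 3;   -- only if the file was created with SQLCipher 3
    SELECT count(*) FROM sqlite_master;   -- quick check that the key was accepted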

Case insensitive collation still uses case sensitive comparison

According to the PostgreSQL documentation, a collation can be created to ignore case during comparison operations. CREATE COLLATION IF NOT EXISTS case_insensitiv
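
The detail that is easy to miss is that the collation must be nondeterministic for equality comparisons to actually ignore case; a deterministic ICU collation still falls back to a byte-wise tie-break. A sketch along the lines of the documented example:

    CREATE COLLATION IF NOT EXISTS case_insensitive (
        provider = icu,
        locale = 'und-u-ks-level2',   -- secondary strength: case differences are ignored
        deterministic = false
    );

    SELECT 'ABC' = 'abc' COLLATE case_insensitive;   -- true

Note that LIKE and regular-expression matching historically do not accept nondeterministic collations.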

How do I select the columns of a table in databricks sql?

I can use show columns in table_name, but this does not let me use the output in a query. This throws an error: SELECT * FROM show columns in table_name
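
SHOW COLUMNS is a command, not a table expression, so it cannot sit in a FROM clause. Assuming a Unity Catalog workspace, the information_schema view is queryable instead (catalog, schema, and table names are placeholders):

    SELECT column_name, data_type
    FROM   my_catalog.information_schema.columns
    WHERE  table_schema = 'my_schema'
      AND  table_name   = 'table_name'
    ORDER  BY ordinal_position;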

How to find all paths recursively from relational table in SQL Server using transitive property?

I would like to be able to recursively find all relationships in a table. I have a relational table, and essentially I would like to apply the transitive proper
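
In SQL Server this is typically a recursive CTE: seed it with the direct relationships, then keep joining back to the relation table. A sketch assuming an edge table Relations(parent_id, child_id):

    WITH paths AS (
        SELECT parent_id,
               child_id,
               CAST(CONCAT(parent_id, '->', child_id) AS varchar(max)) AS path
        FROM   Relations
        UNION ALL
        SELECT p.parent_id,
               r.child_id,
               CAST(CONCAT(p.path, '->', r.child_id) AS varchar(max))
        FROM   paths AS p
        JOIN   Relations AS r ON r.parent_id = p.child_id
    )
    SELECT parent_id, child_id, path
    FROM   paths
    OPTION (MAXRECURSION 1000);   -- cap recursion in case the data contains cycles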

comment SQL query with variables in python

I'm working with a Cloud Function. I have the following query working correctly: # this is working q = """ SELECT col1, col2 FROM `my_table` WHER
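
Inside the SQL string itself, a line is disabled with a SQL comment; BigQuery (implied by the backticks) accepts both -- and #, while a Python # outside the string only comments out Python code. A sketch of the query text with one placeholder predicate commented out:

    SELECT col1, col2
    FROM   `my_table`
    WHERE  col1 = 'x'
      -- AND col2 = 'y'   -- disabled inside the SQL, not with a Python '#'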

Write a query in MySQL that prints a list of employee names who have been employed for less than 10 months and have salary > 2000

I want a SQL query that prints a list of employee names who have been employed for less than 10 months and have a salary > 2000. Sort this result by ascending emp_
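
A sketch of the usual form, assuming a table employee(employee_id, name, months, salary); the column actually named in the truncated sort clause would replace employee_id:

    SELECT name
    FROM   employee
    WHERE  salary > 2000
      AND  months < 10
    ORDER  BY employee_id ASC;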

need the dv/dt for the below table

How to get the dv/dt of the below table in psql? The concept is linear regression, but determining the slope is what I am having trouble with. voltage || time
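
Two PostgreSQL sketches, depending on what dv/dt should mean here; table and column names are assumptions, and time is assumed to be numeric (extract the epoch first if it is a timestamp):

    -- per-row slope between consecutive samples
    SELECT time,
           (voltage - LAG(voltage) OVER (ORDER BY time))
           / NULLIF(time - LAG(time) OVER (ORDER BY time), 0) AS dv_dt
    FROM   readings;

    -- single least-squares slope over the whole table
    SELECT regr_slope(voltage, time) AS dv_dt
    FROM   readings;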

Byte to Varbinary(max) to Base64

OK, so I generate an image in C# and save it as follows: ... using (MemoryStream ms = new MemoryStream()) { bitMap.Save(ms, System.Drawi
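
On the SQL Server side, the common trick for turning varbinary(max) into Base64 is the XML xs:base64Binary conversion; a sketch with assumed table and column names:

    SELECT CAST('' AS xml).value(
               'xs:base64Binary(sql:column("ImageData"))',
               'varchar(max)'
           ) AS ImageBase64
    FROM   dbo.Images;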

Take the nearest row by condition in date, student frame

I am struggling to get the nearest 'Math Test' or 'Biology Test' (within +/- 3 hours) from Test = 'Marked A+', taking the TestOrder ordering into account. If 'Math Test' or 'Biolog
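
One way to express "nearest row inside a window" in SQL Server is OUTER APPLY with TOP (1) ordered by the absolute time difference. A heavily simplified sketch; the schema and the TestOrder tie-break are assumptions:

    SELECT m.StudentId, m.TestDate AS MarkedAt, nearest.Test, nearest.TestDate
    FROM   Tests AS m
    OUTER APPLY (
        SELECT TOP (1) t.Test, t.TestDate, t.TestOrder
        FROM   Tests AS t
        WHERE  t.StudentId = m.StudentId
          AND  t.Test IN ('Math Test', 'Biology Test')
          AND  t.TestDate BETWEEN DATEADD(HOUR, -3, m.TestDate)
                              AND DATEADD(HOUR,  3, m.TestDate)
        ORDER  BY ABS(DATEDIFF(MINUTE, m.TestDate, t.TestDate)), t.TestOrder
    ) AS nearest
    WHERE  m.Test = 'Marked A+';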

SQL Convert numbers into Date

I'm a beginner in SQL and I need your help, please! I have a project which is to convert Social Security Numbers into birth dates. In the database, I have a column M
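
If the number really encodes the birth date in its leading digits (the premise of the project), the conversion is substring extraction plus date construction. A sketch assuming a YYMMDD prefix and SQL Server syntax; the column name, digit layout, and century rule are all placeholders:

    SELECT ssn,
           DATEFROMPARTS(
               1900 + CAST(SUBSTRING(ssn, 1, 2) AS int),   -- century handling is an assumption
               CAST(SUBSTRING(ssn, 3, 2) AS int),
               CAST(SUBSTRING(ssn, 5, 2) AS int)
           ) AS birth_date
    FROM   people;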

Error handling block not executing in Snowflake using SQL

I'm trying to implement error handling in Snowflake using a try/catch block. I enclosed the SQL queries in JavaScript to apply error handling. When I execute the qu
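
If the logic is written in Snowflake Scripting rather than a JavaScript procedure, the try/catch equivalent is the EXCEPTION clause of a BEGIN ... END block. A minimal sketch (the statement and messages are placeholders):

    EXECUTE IMMEDIATE $$
    BEGIN
        INSERT INTO target_table SELECT * FROM source_table;
        RETURN 'loaded';
    EXCEPTION
        WHEN statement_error THEN
            RETURN 'failed: ' || sqlerrm;   -- sqlerrm carries the error message
    END;
    $$;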

MySQL format number with unknown number of decimal places

In MySQL, I only want to add a thousands separator to numbers like 1234.23234, 242343.345345464, 232423.22 and format them as "1,234.23234", "242,343.345345464", "23
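
FORMAT() needs an explicit decimal count, but that count can be computed from the value's own text. A sketch, assuming the values are stored as strings or exact decimals (with floats the textual length would not be reliable):

    SELECT val,
           FORMAT(val,
                  IF(val LIKE '%.%',
                     CHAR_LENGTH(SUBSTRING_INDEX(val, '.', -1)),
                     0)) AS pretty
    FROM   t;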

How to use ISO-8601 date in flink SQL?

Based on my research, Flink SQL accepts "0000-01-01 00:00:00.000000000" as the timestamp format, but my timestamps in Kafka are coming in "0000-01-01T00:00:00.00
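
Flink SQL's TO_TIMESTAMP takes an optional Java-style format string, so the literal 'T' can be declared in the pattern, or the 'T' can simply be replaced before parsing. A sketch assuming the raw value is a STRING column named ts_raw with millisecond precision (adjust the fractional-seconds part of the pattern to match the real input):

    -- declare the literal 'T' in the parse pattern
    SELECT TO_TIMESTAMP(ts_raw, 'yyyy-MM-dd''T''HH:mm:ss.SSS') AS ts FROM events;

    -- or normalize the string first, then parse with the matching pattern
    SELECT TO_TIMESTAMP(REPLACE(ts_raw, 'T', ' '), 'yyyy-MM-dd HH:mm:ss.SSS') AS ts FROM events;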