I have written a couple of articles around this, relating to transmitting large files over WCF and enabling FILESTREAM in SQL. This article deals with actually inserting one of those large files into the DB and retrieving it back out again.
The following method does not use FILESTREAM; that requires a slightly different syntax.
The Database
If you have a look at the linked articles, you’ll already have seen how the data I’m dealing with is arranged; however, here’s a create statement for the table, just in case you want to try this:
CREATE TABLE [dbo].[BinaryDataTest](
[ROWGUID] [uniqueidentifier] ROWGUIDCOL NOT NULL,
[DataName] [nchar](10) NOT NULL,
[Data] [varbinary](max) FILESTREAM NULL,
[Data2] [varbinary](max) NULL,
UNIQUE NONCLUSTERED
(
[ROWGUID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY] FILESTREAM_ON [fs_fg_filestream]
For completeness, my DB is called TestDB.
You’ll notice that `Data` uses FILESTREAM. However, I won’t cover that in this post.
The Service
Here’s an example of how you would write the insert statement in your service (the same method should work whether or not a service is used):
public void InsertData(Stream value)
{
    string connectionString = ConfigurationManager.ConnectionStrings["TestDB"].ConnectionString;

    using (SqlConnection cn = new SqlConnection(connectionString))
    using (SqlCommand cmd = cn.CreateCommand())
    {
        cn.Open();
        cmd.CommandText = "INSERT INTO [dbo].[BinaryDataTest] (" +
            "[ROWGUID],[DataName],[Data2] ) " +
            "VALUES (NEWID(), 'test', @DataVarBinary)";
        cmd.CommandType = System.Data.CommandType.Text;

        // Buffer the incoming stream; the WCF stream can't be handed
        // straight to SqlBytes because it isn't seekable.
        MemoryStream newStream = new MemoryStream();
        value.CopyTo(newStream);
        newStream.Position = 0; // rewind so the full buffer is read from the start

        // VarBinary plus SqlBytes maps the buffered data onto varbinary(max).
        SqlParameter sqlParameterBin =
            new SqlParameter("@DataVarBinary", SqlDbType.VarBinary);
        sqlParameterBin.Value = new SqlBytes(newStream);
        cmd.Parameters.Add(sqlParameterBin);

        cmd.ExecuteNonQuery();
    }
}
As you can see, I have a connection string called “TestDB”; other than that, I think the only remarkable thing (that is: thing worthy of remark - not astounding) is the SqlParameter set-up. Use the VarBinary SQL type, wrap the stream in ADO.NET’s SqlBytes class, and you’re good to go.
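If you’d rather not keep the MemoryStream around until the command executes, a plain byte array works as well. This is just a minimal sketch, assuming the same command and stream as above; the -1 size is what tells ADO.NET the parameter is varbinary(max):

// Alternative sketch: bind a byte array instead of SqlBytes.
// Size -1 maps the parameter to varbinary(max).
var sqlParameterBytes = new SqlParameter("@DataVarBinary", SqlDbType.VarBinary, -1)
{
    Value = newStream.ToArray()
};
cmd.Parameters.Add(sqlParameterBytes);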
Next, there’s the data retrieval:
public Stream GetData(string dataName)
{
    string connectionString = ConfigurationManager.ConnectionStrings["TestDB"].ConnectionString;

    using (SqlConnection cn = new SqlConnection(connectionString))
    using (SqlCommand cmd = cn.CreateCommand())
    {
        cn.Open();
        cmd.CommandText = "SELECT [ROWGUID],[DataName],[Data],[Data2]" +
            " FROM [dbo].[BinaryDataTest]" +
            " WHERE DataName = @DataName";
        cmd.CommandType = System.Data.CommandType.Text;
        cmd.Parameters.AddWithValue("@DataName", dataName);

        using (var rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                // With the default CommandBehavior the row is buffered, so the
                // returned stream stays readable after the reader is disposed.
                Stream str = rdr.GetStream(rdr.GetOrdinal("Data2"));
                return str;
            }
        }
    }

    throw new Exception("Invalid data");
}
A familiar looking idea. You’ll see that I’m only returning the `Data2` column, as stated earlier, and am using the SqlDataReader.GetStream() method.
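One thing to be aware of: with the default behaviour, the reader buffers the whole row in memory before GetStream() hands it back. For very large values you can ask for CommandBehavior.SequentialAccess instead, in which case the column is streamed and has to be copied before the reader is disposed. A minimal sketch, assuming a hypothetical `destination` stream supplied by the caller:

// Hypothetical variation: stream the column rather than buffering the whole row.
// Columns must then be read in order, and the copy has to happen inside the using block.
using (var rdr = cmd.ExecuteReader(System.Data.CommandBehavior.SequentialAccess))
{
    if (rdr.Read())
    {
        using (Stream dbStream = rdr.GetStream(rdr.GetOrdinal("Data2")))
        {
            dbStream.CopyTo(destination);
        }
    }
}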
The Client
I’m deliberately omitting the configuration that enables you to send these files; that’s documented here.
Here’s the Main() function of a client console app:
static void Main(string[] args)
{
    ServiceReference1.Service1Client svc = new ServiceReference1.Service1Client();

    // Read a local file and push it into the DB via the service.
    using (Stream stream = File.OpenRead(@"c:\tmp\test.bmp"))
    {
        svc.InsertData(stream);
    }

    // Pull it back out of the DB and write it to a new file.
    using (Stream strDest = File.OpenWrite(@"c:\tmp2\testdestination.bmp"))
    using (Stream str2 = svc.GetData("test"))
    {
        str2.CopyTo(strDest);
    }
}
Summary
So, we’re reading a file from c:\tmp into a stream and sending it, via WCF, into the SQL DB. Then we’re reading it back out of the SQL DB and sending it back over to the client, which writes it out to a file.
I fully intend to cover how this differs in a FILESTREAM column in a later post.