Hello all, I am not able to upload multiple attachments with Postman and I don't understand why.
Here is my API for replacing existing files:
[HttpPost]
[Route("api/attachments/UpdateMultiple")]
public Result UpdateMultiple(List<int> id)
{
    try
    {
        ATTACHMENT[] results = new ATTACHMENT[] { };
        int position = 0;
        foreach (int i in id)
        {
            var file = HttpContext.Current.Request.Files[position];
            ATTACHMENT attachment = entities.ATTACHMENT.Where(a => a.IDATTACHMENT == i).FirstOrDefault();
            byte[] fileBytes = new byte[] { };
            using (var ms = new MemoryStream())
            {
                file.InputStream.CopyTo(ms);
                fileBytes = ms.ToArray();
            }
            attachment.Binarydata = fileBytes;
            attachment.TypeFILE = file.ContentType;
            attachment.NAMEFILE = file.FileName;
            entities.SaveChanges();
            results[position] = attachment;
            position++;
        }
        return new Result("OK", results);
    }
    catch (Exception e)
    {
        return new Result("KO", e.ToString());
    }
}
Here is my Postman request:
Any suggestions? I have no idea what to do.
Try adding FromQueryAttribute/FromUriAttribute (depending on the ASP.NET version) to the parameter:
public Result UpdateMultiple([FromQuery] List<int> id)
Or
public Result UpdateMultiple([FromUri] List<int> id)
Update:
ATTACHMENT[] results = new ATTACHMENT[] { }; creates an empty array, so accessing it by any index will throw an exception. Change it to var results = new ATTACHMENT[id.Count];
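For example, a minimal sketch of the action with both changes applied (the entities context and Result type are the ones from the question):
[HttpPost]
[Route("api/attachments/UpdateMultiple")]
public Result UpdateMultiple([FromUri] List<int> id)
{
    // Sized up front so indexing by position cannot go out of range.
    var results = new ATTACHMENT[id.Count];
    int position = 0;
    foreach (int i in id)
    {
        var file = HttpContext.Current.Request.Files[position];
        var attachment = entities.ATTACHMENT.FirstOrDefault(a => a.IDATTACHMENT == i);
        // ...copy file.InputStream into attachment and call entities.SaveChanges() as in the question...
        results[position] = attachment;
        position++;
    }
    return new Result("OK", results);
}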
Solved it with the help of Guru Stron, thank you!
To fix it, I had to add/modify:
public Result UpdateMultiple([FromUri] List<int> id)
{
    List<ATTACHMENT> results = new List<ATTACHMENT>();
    int pos = 0;
    foreach (int i in id)
    {
        var file = HttpContext.Current.Request.Files[pos];
        pos++;
        [...]
        results.Add(attachment);
        [...]
    }
}
I am trying to attach large files to a ToDoTask using the Graph API, following the example in the docs for attaching large files to a ToDoTask and the recommended LargeFileUploadTask class for uploading large files.
I have successfully done this before when attaching large files to emails and sending them, so I used that as a base for the following method.
public async Task CreateTaskBigAttachments( string idList, string title, List<string> categories,
BodyType contentType, string content, Importance importance, bool isRemindOn, DateTime? dueTime, cAttachment[] attachments = null)
{
try
{
var _newTask = new TodoTask
{
Title = title,
Categories = categories,
Body = new ItemBody()
{
ContentType = contentType,
Content = content,
},
IsReminderOn = isRemindOn,
Importance = importance
};
if (dueTime.HasValue)
{
var _timeZone = TimeZoneInfo.Local;
_newTask.DueDateTime = DateTimeTimeZone.FromDateTime(dueTime.Value, _timeZone.StandardName);
}
var _task = await _graphServiceClient.Me.Todo.Lists[idList].Tasks.Request().AddAsync(_newTask);
//Add attachments
if (attachments != null)
{
if (attachments.Length > 0)
{
foreach (var _attachment in attachments)
{
var _attachmentContentSize = _attachment.ContentBytes.Length;
var _attachmentInfo = new AttachmentInfo
{
AttachmentType = AttachmentType.File,
Name = _attachment.FileName,
Size = _attachmentContentSize,
ContentType = _attachment.ContentType
};
var _uploadSession = await _graphServiceClient.Me
.Todo.Lists[idList].Tasks[_task.Id]
.Attachments.CreateUploadSession(_attachmentInfo).Request().PostAsync();
using (var _stream = new MemoryStream(_attachment.ContentBytes))
{
_stream.Position = 0;
LargeFileUploadTask<TaskFileAttachment> _largeFileUploadTask = new LargeFileUploadTask<TaskFileAttachment>(_uploadSession, _stream, MaxChunkSize);
try
{
await _largeFileUploadTask.UploadAsync();
}
catch (ServiceException errorGraph)
{
if (errorGraph.StatusCode == HttpStatusCode.InternalServerError || errorGraph.StatusCode == HttpStatusCode.BadGateway
|| errorGraph.StatusCode == HttpStatusCode.ServiceUnavailable || errorGraph.StatusCode == HttpStatusCode.GatewayTimeout)
{
Thread.Sleep(1000); //Wait time until next attempt
//Try again
await _largeFileUploadTask.ResumeAsync();
}
else
throw errorGraph;
}
}
}
}
}
}
catch (ServiceException errorGraph)
{
throw errorGraph;
}
catch (Exception ex)
{
throw ex;
}
}
Up to the point of creating the task everything goes well: it creates the task for the user and it is properly shown in the user's task list. It also creates an upload session properly.
The problem comes when I try to upload the large file in the UploadAsync call.
The following error happens:
Code: InvalidAuthenticationToken Message: Access token is empty.
But according to the LargeFileUploadTask doc, the client does not need to set auth headers:
param name="baseClient" To use for making upload requests. The client should not set Auth headers as upload urls do not need them.
Is LargeFileUploadTask not allowed to be used to upload large files to a ToDoTask?
If not, what is the proper way to upload large files to a ToDoTask using the Graph API? Can someone provide an example?
If you want, you can raise an issue with the details here so that they can have a look: https://github.com/microsoftgraph/msgraph-sdk-dotnet-core/issues.
It seems like it's a bug and they are working on it.
As a temporary workaround, I used the following code to deal with the large files.
var _task = await _graphServiceClient.Me.Todo.Lists[idList].Tasks.Request().AddAsync(_newTask);
//Add attachments
if (attachments != null)
{
if (attachments.Length > 0)
{
foreach (var _attachment in attachments)
{
var _attachmentContentSize = _attachment.ContentBytes.Length;
var _attachmentInfo = new AttachmentInfo
{
AttachmentType = AttachmentType.File,
Name = _attachment.FileName,
Size = _attachmentContentSize,
ContentType = _attachment.ContentType
};
var _uploadSession = await _graphServiceClient.Me
.Todo.Lists[idList].Tasks[_task.Id]
.Attachments.CreateUploadSession(_attachmentInfo).Request().PostAsync();
// Get the upload URL and the next expected range from the response
string _uploadUrl = _uploadSession.UploadUrl;
using (var _stream = new MemoryStream(_attachment.ContentBytes))
{
_stream.Position = 0;
// Create a byte array to hold the contents of each chunk
byte[] _chunk = new byte[MaxChunkSize];
//Bytes to read
int _bytesRead = 0;
//Times the stream has been read
var _ind = 0;
while ((_bytesRead = _stream.Read(_chunk, 0, _chunk.Length)) > 0)
{
// Calculate the range of the current chunk
string _currentChunkRange = $"bytes {_ind * MaxChunkSize}-{_ind * MaxChunkSize + _bytesRead - 1}/{_stream.Length}";
//Afterwards we should calculate the next expected range in case we need it
// Create a ByteArrayContent object from the chunk
ByteArrayContent _byteArrayContent = new ByteArrayContent(_chunk, 0, _bytesRead);
// Set the header for the current chunk
_byteArrayContent.Headers.Add("Content-Range", _currentChunkRange);
_byteArrayContent.Headers.Add("Content-Type", _attachment.ContentType);
_byteArrayContent.Headers.Add("Content-Length", _bytesRead.ToString());
// Upload the chunk using the httpClient Request
var _client = new HttpClient();
var _requestMessage = new HttpRequestMessage()
{
RequestUri = new Uri(_uploadUrl + "/content"),
Method = HttpMethod.Put,
Headers =
{
{ "Authorization", bearerToken },
}
};
_requestMessage.Content = _byteArrayContent;
var _response = await _client.SendAsync(_requestMessage);
if (!_response.IsSuccessStatusCode)
throw new Exception("File attachment failed");
_ind++;
}
}
}
}
}
ASP.NET Core MVC, framework net6.0.
I have a page in which I upload an image and save it to the DB.
The file I'm getting from the view is IFormFile.
I want to be able to check the resolution (width and height) of the photo before saving it in the DB.
Can it be done with IFormFile?
Here is the controller that handles the file:
public JsonResult Submit(IFormFile PhotoFile)
{
int success = 0;
string excep = "";
try
{
if (PhotoFile.Length > 0)
{
using (var ms = new MemoryStream())
{
PhotoFile.CopyTo(ms);
var fileBytes = ms.ToArray();
}
}
ApplicationUser appUser =
_unitOfWork.ApplicationUser.GetAll().Where(a => a.UserName == User.Identity.Name).FirstOrDefault();
if (appUser != null)
{
FileUpload fileUpload = new FileUpload()
{
file = PhotoFile,
CompanyId = appUser.CompanyId
};
SaveFile(fileUpload);
}
excep = "success";
success = 1;
return Json(new { excep, success });
}
catch (Exception ex)
{
excep = "fail";
success = 0;
return Json(new { excep, success });
}
}
public string SaveFile(FileUpload fileObj)
{
Company company = _unitOfWork.Company.GetAll().
Where(a => a.Id == fileObj.CompanyId).FirstOrDefault();
if(company != null && fileObj.file.Length > 0)
{
using (var ms = new MemoryStream())
{
fileObj.file.CopyTo(ms);
var fileBytes = ms.ToArray();
company.PhotoAd = fileBytes;
_unitOfWork.Company.Update(company);
_unitOfWork.Save();
return "Saved";
}
}
return "Failed";
}
As far as I know this isn't possible with just IFormFile; you need System.Drawing.Common.
So first you need to convert it, like:
using var image = Image.FromStream(PhotoFile.OpenReadStream());
Then you can simply get the height/width with image.Height and image.Width.
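For example, a minimal sketch of how the check could look at the top of the existing Submit action (this assumes the System.Drawing.Common NuGet package is referenced, which on .NET 6 is Windows-only by default; the 1920x1080 limit is just a placeholder):
using System.Drawing;

public JsonResult Submit(IFormFile PhotoFile)
{
    // Read the uploaded file into an Image to inspect its resolution
    // before running the existing save logic.
    using (var image = Image.FromStream(PhotoFile.OpenReadStream()))
    {
        if (image.Width > 1920 || image.Height > 1080)
        {
            return Json(new { excep = "resolution too large", success = 0 });
        }
    }
    // ...continue with the existing save logic...
    return Json(new { excep = "success", success = 1 });
}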
I am trying to save data from a public Web API to a txt file. However, it seems that somewhere here
using (var fs = FileService.CreateFile("filename.txt"))
{
// Add some text to file
var title = new UTF8Encoding(true).GetBytes(strContent);
fs.WriteAsync(title, 0, strContent.Length);
}
I am making a mistake, as I am missing some data at the end.
public void GetData()
{
var path = "https://www.cnb.cz/cs/financni-trhy/devizovy-trh/kurzy-devizoveho-trhu/kurzy-devizoveho-trhu/denni_kurz.txt";
string strContent;
var webRequest = WebRequest.Create(path);
using (var response = webRequest.GetResponse())
using(var content = response.GetResponseStream())
using(var reader = new StreamReader(content))
{
strContent = reader.ReadToEnd();
}
using (var fs = FileService.CreateFile("filename.txt"))
{
// Add some text to file
var title = new UTF8Encoding(true).GetBytes(strContent);
fs.WriteAsync(title, 0, strContent.Length);
}
var file = File.ReadAllLines(FileService.ReturnBinLocation("filename.txt"));
var results = new List<string>();
for (var a = 0; a < file.Length; a++)
{
results.Add(file[a]);
File.WriteAllLines(data, results);
}
var sub2 = File.ReadAllText(data);
sub2 = sub2.Replace('\n', '|').TrimEnd('|');
var split = sub2.Split('|');
var list = new List<DailyCourse>();
var i= 0;
do
{
var model = new DailyCourse();
model.Country = split[i]; i++;
model.Currency = split[i]; i++;
model.Amount = split[i]; i++;
model.Code = split[i]; i++;
model.Course = split[i]; i++;
list.Add(model);
} while ( i < split.Length);
var json = JsonSerializer.Serialize(list);
}
public static class FileService
{
public static FileStream CreateFile(string fileName)
{
var wholePath = ReturnBinLocation(fileName);
if (File.Exists(wholePath))
{
File.Delete(wholePath);
}
return File.Create(wholePath);
}
public static string ReturnBinLocation( string fileName)
{
var binPath = Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().GetName().CodeBase );
var wholePath = Path.Combine(binPath, fileName);
int endIndex = wholePath.Length - 5;
var sub = wholePath.Substring(5, endIndex);
return sub;
}
}
I actually found out that the problem was this "Encoding.UTF8.GetBytes"; when I switched it to Encoding.ASCII.GetBytes, it worked.
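For what it's worth, the missing bytes can also be explained without changing the encoding: strContent.Length is the number of characters, while the UTF-8 byte array can be longer than that when the text contains non-ASCII characters, so only part of the buffer gets written. A sketch that keeps UTF-8, assuming the same FileService helper:
using (var fs = FileService.CreateFile("filename.txt"))
{
    byte[] title = new UTF8Encoding(true).GetBytes(strContent);
    // Write the whole encoded buffer (title.Length bytes, not strContent.Length)
    // and let the write finish before the stream is disposed.
    fs.Write(title, 0, title.Length);
}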
We have a Web API application which runs on .NET 4.6.1. We have tried several times to figure out where it is getting deadlocked, but failed. Below is the code snippet. We hit this API endpoint every minute; it picks 300 transactions at a time from the DB for processing. We have observed that it gets stuck when there are no files to process in the DB (not sure though). It would be helpful if someone could help us. TIA
public class TaxEngineIntegratorController : ApiController
{
public async Task Get(int id)
{
try
{
await MainFileMethod();
}
catch (Exception Ex)
{
SerilogMethods.LogError(log, Ex, "Get");
}
}
public async Task MainFileMethod()
{
List<FileTransaction> lstFTtoLock = new List<FileTransaction>();
try
{
List<int> lstStatusIds = new List<int>();
lstStatusIds.Add(objStatusManager.GetStatusIdbyName(Status.ConversionToXmlSucceded));
lstStatusIds.Add(objStatusManager.GetStatusIdbyName(Status.Reprocess));
//Getting the serviceURL of TRTaxEngine
string seriviceURL = objConfigManager.GetConfigurationdbyKey(ConfigurationList.TRTaxEngineURL);
//Getting the output path for the file to be placed after processing
string outputfilePath = objConfigManager.GetConfigurationdbyKey(ConfigurationList.TRTaxOutputXMLFolder);
FileMasterManager objFileMasterManager = new FileMasterManager();
TRTaxXMLOperations objxmlresp = new TRTaxXMLOperations();
//Getting all the files list for proccessing from the DB
List<FileTransaction> lstFiletoProcess = await objTransManager.GetFileListforProcessingAsync(lstStatusIds, true);
lstFTtoLock = lstFiletoProcess;
if (lstFiletoProcess.Count == 0)
return;
if (lstFiletoProcess.Count > 0)
{
var tasks = new List<Task<string>>();
using (HttpClient httpClnt = new HttpClient())
{
httpClnt.Timeout = TimeSpan.FromMilliseconds(-1);
foreach (FileTransaction item in lstFiletoProcess)
{
TRXMLResponseModel objRespModel = new TRXMLResponseModel();
objRespModel.strxmlResponse = string.Empty;
string fullFileName = item.FilePath + item.ConvertedName;
objRespModel.outputFilename = outputfilePath + item.ConvertedName;
FileMaster fileMaster = objFileMasterManager.GetById(item.FileId);
//Proccessing the file and getting the output filedata
Task<string> t = objxmlresp.GetXMLResponse(seriviceURL, fullFileName, fileMaster.CountryId.GetValueOrDefault(), httpClnt, objFileOperation, objRespModel.outputFilename, item);
tasks.Add(t);
objRespModel.strxmlResponse = await t;
}
var result = await Task.WhenAll(tasks);
}
SerilogMethods.LogCustomException(log, "Http Client Destroyed in Tax Engine", "GetXMLResponse");
}
}
catch (Exception Ex)
{
if (lstFTtoLock != null && lstFTtoLock.Count > 0)
{
objTransManager.UpdateFileTransactionIsPickedtoFalse(lstFTtoLock);
}
throw Ex;
}
}
}
//Getting all the files list for proccessing from the DB
public async Task<List<FileTransaction>> GetFileListforProcessingAsync(List<int> lstStatusList, bool IsActive)
{
try
{
List<FileTransaction> lstFTList = new List<FileTransaction>();
using (SUTBACDEVContext db = new SUTBACDEVContext())
{
//DataTable dtFileTransactions = GetFileTransactionListAsync(lstStatusList, IsActive);
string connectionString = db.Database.GetDbConnection().ConnectionString;
var conn = new SqlConnection(connectionString);
string query = @"[SUTGITA].[GetFileListforProcessing]";
using (var sqlAdpt = new SqlDataAdapter(query, conn))
{
sqlAdpt.SelectCommand.CommandType = CommandType.StoredProcedure;
sqlAdpt.SelectCommand.Parameters.AddWithValue("#StatusId", string.Join(",", lstStatusList.Select(n => n.ToString()).ToArray()));
sqlAdpt.SelectCommand.Parameters.AddWithValue("#IsActive", IsActive);
sqlAdpt.SelectCommand.CommandTimeout = 60000;
DataTable dtFileTransactions = new DataTable();
sqlAdpt.Fill(dtFileTransactions);
if (dtFileTransactions != null && dtFileTransactions.Rows.Count > 0)
{
IEnumerable<long> ids = dtFileTransactions.AsEnumerable().ToList().Select(p => p["id"]).ToList().OfType<long>();
lstFTList = await db.FileTransaction.Include(x => x.File.Country).Where(x => ids.Contains(x.Id)).OrderBy(x => x.Id).ToListAsync();
}
}
}
return lstFTList;
}
catch (Exception ex)
{
throw ex;
}
}
public async Task<string> GetXMLResponse(string baseUrl, string fullFileName, int countryId, HttpClient client, FileOperations objFileOperation, string outputfilePath, FileTransaction item)
{
try
{
var fileData = new StringBuilder(objFileOperation.ReadFile(fullFileName));
using (HttpContent content = new StringContent(TransformToSOAPXml(fileData, countryId), Encoding.UTF8, "text/xml"))
{
using (HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Post, baseUrl))
{
request.Headers.Add("SOAPAction", "");
request.Content = content;
using (HttpResponseMessage response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead))
{
response.EnsureSuccessStatusCode();
if (response.IsSuccessStatusCode)
{
using (Stream streamToReadFrom = await response.Content.ReadAsStreamAsync())
{
using (Stream streamToWriteTo = File.Open(outputfilePath, FileMode.Create))
{
await streamToReadFrom.CopyToAsync(streamToWriteTo);
}
}
var transactionEntry = new FileTransaction
{
FileId = item.FileId,
FilePath = outputfilePath,
ConvertedName = item.ConvertedName,
ActionedBy = Process.Process3,
TimeStamp = DateTime.UtcNow,
StatusId = objStatusManager.GetStatusIdbyName(Status.OutputXmlReceived),
IsActive = true,
CreatedBy = Others.Scheduler,
CreatedOn = DateTime.UtcNow,
ModifiedBy = Others.Scheduler,
ModifiedOn = DateTime.UtcNow
};
//Inserting the new record and Updating isActive filed of previous record in Tranasaction table(Calling updateDataonTRSuccess method of TRTaxXMLOperations class)
await updateDataonTRSuccessAsync(item, transactionEntry);
return "Success";
}
else
{
SerilogMethods.LogCustomException(log, "Error occured in Tax Engine", "GetXMLResponse");
//Log the SOAP response when the SOAP fails with an error message
if (response.Content != null)
{
throw new Exception(await response.Content.ReadAsStringAsync());
}
return null;
}
}
}
}
}
catch (Exception ex)
{
SerilogMethods.LogError(log, ex, "GetXMLResponse");
return null;
}
}
I made the following changes to this specific method to get it working:
Removed this line: objRespModel.strxmlResponse = await t;
Added ConfigureAwait(false) to this line: List<FileTransaction> lstFiletoProcess = await objTransManager.GetFileListforProcessingAsync(lstStatusIds, true).ConfigureAwait(false);
Below is the working code.
public async Task MainFileMethod()
{
List<FileTransaction> lstFTtoLock = new List<FileTransaction>();
try
{
List<int> lstStatusIds = new List<int>();
lstStatusIds.Add(objStatusManager.GetStatusIdbyName(Status.ConversionToXmlSucceded));
lstStatusIds.Add(objStatusManager.GetStatusIdbyName(Status.Reprocess));
//Getting the serviceURL of TRTaxEngine
string seriviceURL = objConfigManager.GetConfigurationdbyKey(ConfigurationList.TRTaxEngineURL);
//Getting the output path for the file to be placed after processing
string outputfilePath = objConfigManager.GetConfigurationdbyKey(ConfigurationList.TRTaxOutputXMLFolder);
FileMasterManager objFileMasterManager = new FileMasterManager();
TRTaxXMLOperations objxmlresp = new TRTaxXMLOperations();
//Getting all the files list for proccessing from the DB
List<FileTransaction> lstFiletoProcess = await objTransManager.GetFileListforProcessingAsync(lstStatusIds, true).ConfigureAwait(false);
lstFTtoLock = lstFiletoProcess;
if (lstFiletoProcess.Count == 0)
return;
if (lstFiletoProcess.Count > 0)
{
var tasks = new List<Task<string>>();
using (HttpClient httpClnt = new HttpClient())
{
httpClnt.Timeout = TimeSpan.FromMilliseconds(-1);
//Getting the files for processing
foreach (FileTransaction item in lstFiletoProcess)
{
TRXMLResponseModel objRespModel = new TRXMLResponseModel();
objRespModel.strxmlResponse = string.Empty;
string fullFileName = item.FilePath + item.ConvertedName;
objRespModel.outputFilename = outputfilePath + item.ConvertedName;
FileMaster fileMaster = objFileMasterManager.GetById(item.FileId);
//Proccessing the file and getting the output filedata
Task<string> t = objxmlresp.GetXMLResponse(seriviceURL, fullFileName, fileMaster.CountryId.GetValueOrDefault(), httpClnt, objFileOperation, objRespModel.outputFilename, item, objTransManager);
tasks.Add(t);
//objRespModel.strxmlResponse = await t;
}
var result = await Task.WhenAll(tasks);
}
}
}
catch (Exception Ex)
{
if (lstFTtoLock != null && lstFTtoLock.Count > 0)
{
objTransManager.UpdateFileTransactionIsPickedtoFalse(lstFTtoLock);
}
throw Ex;
}
}
My Recommendation:
The method "Get(int id)" is somewhat confusing. first, it takes "id" and does nothing with it. Also it return nothing so it is not a "Get" method. It is basically asking for all transactions with status "Status.ConversionToXmlSucceded" & "Status.Reprocess" and are active to be gotten and processed via the "objxmlresp.GetXMLResponse" method... You Dont Have To Await the "MainFileMethod();" in "Get(int id)" just return the task or return Ok(); and allow all the process to go on in the background. You can experiment with reducing the "sqlAdpt.SelectCommand.CommandTimeout = 60000;".
I now have successfully working code (with multiple threads) for bulk importing items through the IN202500 screen in Acumatica.
The problem is that I am struggling to import an item's image; in fact I don't have the image itself, only a URL link to it.
So, my question is: has anyone done this in C#?
This is my piece of code.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace ItemImportMultiThreaded
{
public class ItemImporter
{
private IN202500.Screen _itemsScreen;
private static object _itemsSchemaLock = new object();
private static IN202500.Content _itemsSchema;
public void Login(string url, string username, string password, string company)
{
Console.WriteLine("[{0}] Logging in to {1}...", System.Threading.Thread.CurrentThread.ManagedThreadId, url);
_itemsScreen = new IN202500.Screen();
_itemsScreen.Url = url + "/PMSDB/(W(2))/Soap/IN202500.asmx";
_itemsScreen.EnableDecompression = true;
_itemsScreen.CookieContainer = new System.Net.CookieContainer();
_itemsScreen.Timeout = 36000;
_itemsScreen.Login(username, password);
Console.WriteLine("[{0}] Logged in to {1}.", System.Threading.Thread.CurrentThread.ManagedThreadId, url);
lock (_itemsSchemaLock)
{
// Threads can share the same schema.
if (_itemsSchema == null)
{
Console.WriteLine("[{0}] Retrieving IN202500 schema...", System.Threading.Thread.CurrentThread.ManagedThreadId);
_itemsSchema = _itemsScreen.GetSchema();
if (_itemsSchema == null) throw new Exception("IN202500 GetSchema returned null. See AC-73433.");
}
}
}
public void Logout()
{
_itemsScreen.Logout();
}
public void Import(List<Item> items)
{
Console.WriteLine("[{0}] Submitting {1} items to Acumatica...", System.Threading.Thread.CurrentThread.ManagedThreadId, items.Count);
var commands = new IN202500.Command[]
{
_itemsSchema.StockItemSummary.InventoryID,
_itemsSchema.StockItemSummary.Description,
_itemsSchema.GeneralSettingsItemDefaults.ItemClass,
_itemsSchema.VendorDetails.VendorID,
_itemsSchema.VendorDetails.VendorInventoryID,
_itemsSchema.VendorDetails.ServiceCommands.NewRow,
_itemsSchema.VendorDetails.VendorID,
_itemsSchema.VendorDetails.VendorInventoryID,
_itemsSchema.VendorDetails.ServiceCommands.NewRow,
_itemsSchema.VendorDetails.VendorID,
_itemsSchema.VendorDetails.VendorInventoryID,
_itemsSchema.CrossReference.AlternateID,
_itemsSchema.CrossReference.Description,
_itemsSchema.Actions.Save
};
string[][] data = new string[items.Count][];
int count = 0;
foreach(Item item in items)
{
data[count] = new string[11];
data[count][0] = item.InventoryID;
data[count][1] = item.Description.Trim();
data[count][2] = item.ItemClassID;
data[count][3] = item.DigiKey;
data[count][4] = item.DKPN;
data[count][5] = item.Mouser;
data[count][6] = item.MouserID;
data[count][7] = item.Element14;
data[count][8] = item.Element14ID;
data[count][9] = item.AlternateID;
data[count][10] = item.Descr;
count++;
}
_itemsScreen.Import(commands, null, data, false, true, true);
Console.WriteLine("[{0}] Submitted {1} items to Acumatica.", System.Threading.Thread.CurrentThread.ManagedThreadId, items.Count);
}
}
}
I tried to use FileStream but that didn't work.
If by URL link you mean an external HTTP resource, you can download the image and upload it.
The StockItems image field cycles through all the images contained in the Files popup, in the order they are displayed:
I uploaded the images from a static external Url using the following code:
const string imageUrl = "https://cdn.acumatica.com/media/2016/03/software-technology-industries-small.jpg";
string path = Path.Combine(Path.GetTempPath(), Path.ChangeExtension(Path.GetTempFileName(), ".jpg"));
// Download Image
using (WebClient client = new WebClient())
{
client.DownloadFile(new Uri(imageUrl), path);
}
// ReadUploadFile function below
byte[] data = ReadUploadFile(path);
_itemsScreen.Import(new IN202500.Command[]
{
// Get Inventory Item
new Value
{
Value = "D1",
LinkedCommand = _itemsSchema.StockItemSummary.InventoryID,
},
_itemsSchema.Actions.Save,
// Upload Inventory Item Image
new Value
{
FieldName = Path.GetFileName(path),
LinkedCommand = _itemsSchema.StockItemSummary.ServiceCommands.Attachment
},
_itemsSchema.Actions.Save
},
null,
new string[][]
{
new string[]
{
// Image data
Convert.ToBase64String(data)
}
},
false,
false,
true);
public byte[] ReadUploadFile(string filePath)
{
byte[] filedata;
using (FileStream file = File.Open(filePath,
FileMode.Open,
FileAccess.ReadWrite,
FileShare.ReadWrite))
{
filedata = new byte[file.Length];
file.Read(filedata, 0, filedata.Length);
}
if (filedata == null || filedata.Length == 0)
{
throw new Exception(string.Concat("Invalid or empty file: ", filePath));
}
return filedata;
}
You can try using the code below; it is tested.
var content = _context.CR306000GetSchema();
_context.CR306000Clear();
var commands = new List<Command>();
ReqParameter(content, ref commands);
commands.Add(content.Actions.Save);
commands.Add(content.CaseSummary.CaseID);
var orderResults = _context.CR306000Submit(commands.ToArray());
private static void ReqParameter(CR306000Content content, ref List<Command> cmds)
{
if (cmds == null) throw new ArgumentNullException("cmds");
byte[] filedata= null;
Uri uri = new Uri("https://cdn.acumatica.com/media/2016/03/software-technology-industries-small.jpg"); // change the required url of the data that has to be fetched
if (uri.IsFile)
{
string filename = System.IO.Path.GetFileName(uri.LocalPath);
filedata = System.Text.Encoding.UTF8.GetBytes(uri.LocalPath);
}
if (filedata == null)
{
WebClient wc = new WebClient();
filedata = wc.DownloadData(uri);
}
cmds = new List<Command>
{
//Case Header Details
new Value { Value="<NEW>",LinkedCommand = content.CaseSummary.CaseID},
new Value { Value="L41",LinkedCommand = content.CaseSummary.ClassID},
new Value { Value="ABCSTUDIOS",LinkedCommand = content.CaseSummary.BusinessAccount, Commit = true},
new Value { Value="Test subject created from envelop call 11C",LinkedCommand = content.CaseSummary.Subject},
// body of the case
new Value{Value= "Body of the content for created through envelop call 11B", LinkedCommand = content.Details.Description},
//Attaching a file
new Value
{
Value = Convert.ToBase64String(filedata), // byte data that is passed to through envelop
FieldName = "Test.jpg",
LinkedCommand =
content.CaseSummary.ServiceCommands.Attachment
},
};
}
Let me know if this works for you.
Thanks