.NET 6.0 - retrieving string array from a specific appsettings.json file - c#

I have two appsettings files in my project: appsettings.json and appsettings.Development.json. I have string arrays in both files: two developers' e-mails in the Development file and seven production e-mails in the main appsettings.json file.
appsettings.Development.json:
"Contacts": {
"Emails": [ "email1", "email2" ],
}
appsettings.json:
"Contacts": {
"Emails": [ "email3", "email4", "email5", "email6", "email7", "email8", "email9" ],
}
I have this setup in my Program.cs:
WebApplicationOptions opts = new WebApplicationOptions
{
    ApplicationName = "WebServices",
    EnvironmentName = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT")
};
var builder = WebApplication.CreateBuilder(opts);
builder.Services.Configure<Contacts>(builder.Configuration.GetSection("Contacts"));
Then in my StoreDataService.cs:
private readonly Contacts _contactSettings;

public StoreDataService(IOptions<DatabaseSettings> dbSettings, IOptions<Contacts> contactSettings) : base(dbSettings)
{
    _dbSettings = dbSettings.Value;
    _contactSettings = contactSettings.Value;
    string[] emailList = _contactSettings.Emails;
}
The problem is that the emailList array comes back as
[ "email1", "email2", "email5", "email6", "email7", "email8", "email9" ]
in both Development and Production. All other fields resolve correctly between the two environments. Is there something special I need to do for the string array?
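What's happening is a known consequence of how the JSON configuration providers merge files: arrays are flattened into index-based keys (Contacts:Emails:0, Contacts:Emails:1, ...), and appsettings.Development.json overrides the base file key by key. The Development file only supplies indexes 0 and 1, so indexes 2 through 6 survive from appsettings.json, which produces exactly the mixed array above. A minimal sketch to inspect the merged keys (placed after the CreateBuilder call from the question):

// Dump the flattened configuration keys for the Emails array.
// Expect Contacts:Emails:0-1 from the Development file and
// Contacts:Emails:2-6 left over from the base appsettings.json.
foreach (var child in builder.Configuration.GetSection("Contacts:Emails").GetChildren())
{
    Console.WriteLine($"{child.Path} = {child.Value}");
}

A common workaround is to store the list as a single delimited string per environment and split it in code, or to make the environment file's array at least as long as the base one, since there is no built-in way to have one file's array replace another's wholesale.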

Related

Problem with accentued char when reading appsettings.json with builder

I am reading the configuration file appsettings.json with the builder:
var builder = new ConfigurationBuilder()
    .SetBasePath(AppDomain.CurrentDomain.BaseDirectory)
    .AddJsonFile(Path.Combine(Helper.jsonpath, "appsettings.json"));
Helper.config = builder.Build();
// same result: P�le emploi instead of Pôle emploi
var tt = Helper.config.GetSection("partooApi:types:APE").GetValue<string>("title");
var ttt = Helper.config["partooApi:types:APE:title"];
var tttt = Helper.config.GetSection("partooApi:types:APE:title").Get<string>();
I get the result P�le emploi instead of Pôle emploi.
My appsettings.json has nothing special:
"partooApi": {
"types": {
"APE": {
"title": "Pôle emploi",
"grpname": "Agences",
"grpid": 10661
}
:
I don't see any options for the encoding; am I missing something in my code?
Have you checked whether appsettings.json is saved with UTF-8 encoding?
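If the file turns out to be saved in a legacy ANSI code page rather than UTF-8, a one-off re-save fixes it. Below is a minimal sketch, assuming the file was written as Windows-1252 (a common case on French Windows systems); on .NET Core / .NET 5+ the System.Text.Encoding.CodePages NuGet package is required to access that code page:

using System.IO;
using System.Text;

// Assumption: the file is currently Windows-1252; adjust the code page if not.
Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);
var ansi = Encoding.GetEncoding(1252);

string text = File.ReadAllText("appsettings.json", ansi);
// Re-save as UTF-8 (without BOM), which the JSON configuration provider expects.
File.WriteAllText("appsettings.json", text, new UTF8Encoding(false));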

getting "The bucket does not allow ACLs" Error

This is my bucket policy
{
    "Version": "2012-10-17",
    "Id": "************",
    "Statement": [
        {
            "Sid": "************",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:GetObject",
                "s3:GetObjectAcl"
            ],
            "Resource": "************************"
        }
    ]
}
and here's the code I used to upload the image:
[HttpPost]
public bool UploadFile(string file)
{
    var s3Client = new AmazonS3Client(accesskey, secretkey, RegionEndpoint.APSoutheast1);
    var fileTransferUtility = new TransferUtility(s3Client);
    if (file.Length > 0)
    {
        var filePath = file;
        var fileTransferUtilityRequest = new TransferUtilityUploadRequest
        {
            BucketName = bucketName,
            FilePath = filePath,
            StorageClass = S3StorageClass.StandardInfrequentAccess,
            PartSize = 6291456, // 6 MB
            Key = keyName,
            CannedACL = S3CannedACL.PublicRead
        };
        fileTransferUtilityRequest.Metadata.Add("param1", "Value1");
        fileTransferUtilityRequest.Metadata.Add("param2", "Value2");
        fileTransferUtility.Upload(fileTransferUtilityRequest);
        fileTransferUtility.Dispose();
    }
    return true;
}
and I'm getting "The bucket does not allow ACLs" even after setting Object Ownership to "ACLs enabled".
@Rutger's answer is correct, and now it's 2022 and the AWS console has changed (not a lot, but somewhat), so let me show the steps:
1. Assume you have created the S3 bucket; open it from the bucket list page.
2. Don't toggle the "block public access" options.
3. Find "Object Ownership", then click Edit.
4. Set Object Ownership to "ACLs enabled".
5. Now the Edit button for the ACL is clickable.
6. Toggle the permissions you want and save the changes.
That's it; now you can upload images to S3 via the command line and then visit them in your browser.
You should be able to go to the AWS S3 console and navigate to the bucket details for the bucket you're trying to write objects to. You'll see a tab called "Permissions". There you have the option to change "Object Ownership" in the block with the same title.
Once there, you can choose the option "ACLs enabled".
After applying those changes, you should be able to write objects with ACL options.
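Alternatively, if you'd rather keep ACLs disabled (the current S3 default), you can drop the CannedACL property and grant read access through a bucket policy instead. Here's a minimal sketch of the same upload without an ACL; the bucket name, file path, and key are placeholders:

using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;

var s3Client = new AmazonS3Client(RegionEndpoint.APSoutheast1);
using var transferUtility = new TransferUtility(s3Client);

var uploadRequest = new TransferUtilityUploadRequest
{
    BucketName = "my-bucket",        // placeholder
    FilePath = @"C:\temp\image.png", // placeholder
    Key = "images/image.png"         // placeholder
    // No CannedACL: with ACLs disabled, the bucket owner owns every object,
    // and public read access (if desired) comes from a bucket policy instead.
};
transferUtility.Upload(uploadRequest);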

Power BI - Create Datasource Endpoint Error

I'm trying to use the Create Datasource REST endpoint from Microsoft's Power BI API, but I'm getting a 400 error when I send the encrypted string of my Windows credentials. My API is written in Node.js, so I'm using node-rsa to replicate the functionality from the C# solution in their docs.
I generated an encrypted string of my credentials with this C# code and it works fine and returns 201 for creating the Datasource:
var credentials = new WindowsCredentials(username: "myusername", password: "mypassword");
var publicKey = new GatewayPublicKey("<exponent>", "<modulus>");
var credentialsEncryptor = new AsymmetricKeyEncryptor(publicKey);
var credentialDetails = new CredentialDetails(credentials, PrivacyLevel.Organizational, EncryptedConnection.Encrypted, credentialsEncryptor);
This is the encrypted string from the Credentials prop of the credentialDetails object in the code above: oavtaHCi2Nkn0euRQxtRvZ2aDPnLA5/T1B6YHHeQzfBDXLJsU4JOx3ZOm31nEWqPRUh2DLARKI0y+C106wgOhWKgxI4nIz5jmJCQqPWRzOYlJlvdPQCOrTtuvIZb3IRy2OAhBfKOcOA8xqHIkKsqvTX1u4LUgfNuwxmU16FB/d3mzZOtc3+ld2MUp15HEPBCen6qTOkdEOwOm2AqxSidx46xoahLrqjkMtLZqLimooVGeTmuXm7bwoRQa1VKUwPaAuKcBBfAuRJulXyeSOMkwKz90pA30y8g7aB2NvT+92SzzjpuHcs1bzPrbPR1J+WXSRTN7xzlCOyQmC5YiQKn6A==AADPIC5L5rjd7WWThtCNAPunINW330ka1ts53SsBkaelIrZ8f9msgE4JjliaeDM91Qq3KDqm1DUjLw6Fg92LzZhMkjcwdEBued/piCdvIpP0eA9rhX0kjWWaMzQF6T3fHr798OkL7yMPEGi+m7Z3cOz22HP2Ot1ORxf6RH/JUNJmLEepEYCRVVubmKL04IyEtx8dShtSG+upeOaBOaD/GOS5
This is my Node.js solution using node-rsa. I'm having trouble understanding why the encrypted string from the C# code above works, but this one doesn't.
const nodersa = require('node-rsa');

const credentials =
  '{"credentialData":[{"name":"username", "value":"myusername"},{"name":"password", "value":"mypassword"}]}';
const exponentString = '<exponent>';
const modulusString = '<modulus>';

const key = new nodersa();
key.setOptions({
  encryptionScheme: 'pkcs1_oaep',
});
const pubKey = key.importKey({
  n: Buffer.from(modulusString, 'base64'),
  e: Buffer.from(exponentString, 'base64'),
});

const encrypted = pubKey.encrypt(credentials, 'base64');
This is the encrypted string from this code above: UCLIIGF8u6HMdlLAwVeZcM0iLks5YLEWhRfpywnNBrAdSoPlP8/vRqe4knMCnAFiSimHiX8fu/CQlAP0b98Xlzx6sHnW4sQHV5KgUErLQfBP5i+5LGj7lBDB0/nsuf2hLDSrXXUb3F+XXv1mlUTgNZzwanizUuRcqYRxcdMaYOjI1vCaQbW++kHXMgOQPjMBj8hrJ1gW1WznS3zCYy6v8oUwEzJtp2aQEP8Pycvx5KwjVdy9KxkB675+TfddauMlz0B+EpQjC1Z+k87uiuFpGZxwA5FAi+4r8ztZ+9/su+8eieCSBK/8ZokAUNcrLmU0sPuDewaojW1MBdxvJiv7YA==
This is the error that Power BI API returns using string encrypted via node solution. Everything else about my request is identical except the encrypted creds, so I'm confident that that is the issue.
{
    "error": {
        "code": "DM_GWPipeline_UnknownError",
        "pbi.error": {
            "code": "DM_GWPipeline_UnknownError",
            "parameters": {},
            "details": [
                {
                    "code": "DM_ErrorDetailNameCode_UnderlyingErrorMessage",
                    "detail": {
                        "type": 1,
                        "value": "The parameter is incorrect.\r\n"
                    }
                },
                {
                    "code": "DM_ErrorDetailNameCode_UnderlyingHResult",
                    "detail": {
                        "type": 1,
                        "value": "-2146893785"
                    }
                }
            ],
            "exceptionCulprit": 1
        }
    }
}
The gateway public key here is 2048-bit, and for keys that long the C# SDK does not RSA-encrypt the whole payload in one go. Notice that the working C# output above is two concatenated base64 segments, while the node-rsa output is a single RSA-OAEP block. For 2048-bit keys the SDK's encryptor switches to a hybrid scheme in which the credential JSON is encrypted with symmetric key material and only that key material is RSA-encrypted, so you need to replicate that scheme rather than plain RSA-OAEP.
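To confirm that the Node output is just a single plain RSA-OAEP block (and therefore can't match the hybrid output), here is a sketch of a C# equivalent of the node-rsa call; credentialsJson is a placeholder for the credential JSON string from the question, and the exponent/modulus placeholders are the gateway values:

using System;
using System.Security.Cryptography;
using System.Text;

// Plain RSA-OAEP(SHA-1) over a raw modulus/exponent pair -- the same operation
// the node-rsa snippet performs, producing one ciphertext block.
string credentialsJson = "<credential-json>"; // placeholder
using var rsa = RSA.Create();
rsa.ImportParameters(new RSAParameters
{
    Modulus = Convert.FromBase64String("<modulus>"),
    Exponent = Convert.FromBase64String("<exponent>")
});
byte[] cipher = rsa.Encrypt(Encoding.UTF8.GetBytes(credentialsJson), RSAEncryptionPadding.OaepSHA1);
Console.WriteLine(Convert.ToBase64String(cipher)); // single block, like the Node output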

Can't Upload a File to QnaMaker Knowledge Base using SDK

I currently have an Azure Function that I would like to have update a QnAMaker knowledge base every day or so. Everything is connected and working fine; however, I can only send QnA objects (QnA pairs), not URLs to files on a website of mine. In the example below, the call should populate the KB with two questions and the file from the URL, but it only populates the questions.
This is not giving me any kind of error; in fact, the response code from my call to the KB comes back as 204. So it is getting through, but it still doesn't add the file to the KB as it should.
NOTE: The file being imported in this example (alice-I.html) is a random one for this demonstration (not mine, for security), but the issue is the same. If I add this file directly to the QnAMaker KB from the site itself it works fine, but it won't update from the Azure Function code.
Any insights into what is happening would be great.
Content Being Sent To Knowledge Base
string replace_kb = @"{
    'qnaList': [
        {
            'id': 0,
            'answer': 'A-1',
            'source': 'Custom Editorial',
            'questions': [
                'Q-1'
            ],
            'metadata': []
        },
        {
            'id': 1,
            'answer': 'A-2',
            'source': 'Custom Editorial',
            'questions': [
                'Q-2'
            ],
            'metadata': [
                {
                    'name': 'category',
                    'value': 'api'
                }
            ]
        }
    ],
    'files': [
        {
            'fileName': 'alice-I.html',
            'fileUri': 'https://www.cs.cmu.edu/~rgs/alice-I.html'
        }
    ]
}";
Code Sending Content To Knowledge Base
using (var clientF = new HttpClient())
using (var requestF = new HttpRequestMessage())
{
    requestF.Method = HttpMethod.Put;
    requestF.RequestUri = new Uri(<your-uri>);
    requestF.Content = new StringContent(replace_kb, Encoding.UTF8, "application/json");
    requestF.Headers.Add("Ocp-Apim-Subscription-Key", <your-key>);
    var responseF = await clientF.SendAsync(requestF);
    if (responseF.IsSuccessStatusCode)
    {
        log.LogInformation("{'result' : 'Success.'}");
        log.LogInformation($"------------>{responseF}");
    }
    else
    {
        await responseF.Content.ReadAsStringAsync();
        log.LogInformation($"------------>{responseF}");
    }
}
So I still don't know how to get the above working, but I got it to work a different way: I used the UpdateKbOperationDTO class from the QnAMaker SDK, listed here: class.
This still isn't a perfect solution, but it allows me to update my KB with files from code instead of through the interface.
Below is my new code:
QnAMakerClient qnaC = new QnAMakerClient(new ApiKeyServiceClientCredentials(<subscription-key>))
{
    Endpoint = "https://<your-custom-domain>.cognitiveservices.azure.com"
};

log.LogInformation("Delete-->Start");
List<string> toDelete = new List<string>();
toDelete.Add("<my-file>");
var updateDelete = await qnaC.Knowledgebase.UpdateAsync(kbId, new UpdateKbOperationDTO
{
    // Create JSON of changes
    Add = null,
    Update = null,
    Delete = new UpdateKbOperationDTODelete(null, toDelete)
});
log.LogInformation("Delete-->Done");

log.LogInformation("Add-->Start");
List<FileDTO> toAdd = new List<FileDTO>();
toAdd.Add(new FileDTO("<my-file>", "<url-to-file>"));
var updateAdd = await qnaC.Knowledgebase.UpdateAsync(kbId, new UpdateKbOperationDTO
{
    // Create JSON of changes
    Add = new UpdateKbOperationDTOAdd(null, null, toAdd),
    Update = null,
    Delete = null
});
log.LogInformation("Add-->Done");

What is a simple/decent way to encrypt cookie in ASP.NET Core without using IDataProtector?

I am trying to understand how to encrypt the contents of cookies in ASP.NET Core 2.1.
If I use the IDataProtector Protect method to encrypt the contents of a cookie, I have read that the Unprotect method will fail to decrypt it if the website is moved to a different server, runs in a server farm, on Azure, etc.
If that is true, and I don't want to use Redis (or move keys around), what simple method could I use to encrypt cookie contents?
This is my current code (using the IDataProtector):
public static void SetEncryptedCookie()
{
    var cookieData = new Dictionary<string, string>()
    {
        { "a", "123" },
        { "b", "xyz" },
        { "c", DateTime.Now.ToString() },
        { "UserID", "unique-number-789" }
    };

    CookieOptions options = new CookieOptions
    {
        Expires = DateTime.Now.AddDays(90),
        HttpOnly = true,
        Secure = true,
        IsEssential = true, // GDPR
        Domain = ".example.org"
    };

    // _dataProtector = provider.CreateProtector("somepurpose");
    // Protect works on strings/byte arrays, so serialize the dictionary first
    // (System.Text.Json here; Newtonsoft.Json works just as well).
    string protectedCookie = _dataProtector.Protect(JsonSerializer.Serialize(cookieData));

    // IHttpContextAccessor _accessor
    // write cookie to http response
    _accessor.HttpContext.Response.Cookies.Append("mycookie", protectedCookie, options);

    // retrieve encrypted cookie contents.
    // ##### this will fail in server farms.
    var cookieEnc = _accessor.HttpContext.Request.Cookies["mycookie"];
    var cookieDec = _dataProtector.Unprotect(cookieEnc);
}
In public void ConfigureServices(IServiceCollection services), persist the Data Protection key ring to a folder shared by all servers and give every instance the same application name:
DirectoryInfo SharedKeysDataProtectionDirectory = new DirectoryInfo("shared/KeysDataProtection/Directory");
services.AddDataProtection()
    .SetApplicationName("WebFarmSharedAppName")
    .PersistKeysToFileSystem(SharedKeysDataProtectionDirectory);
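One caveat: keys persisted to the file system are not encrypted at rest by default. If the shared folder isn't otherwise locked down, you can additionally encrypt the key ring with a certificate. A sketch, assuming a PFX file that every server in the farm can read (the path and password are placeholders):

using System.IO;
using System.Security.Cryptography.X509Certificates;
using Microsoft.AspNetCore.DataProtection;
using Microsoft.Extensions.DependencyInjection;

services.AddDataProtection()
    .SetApplicationName("WebFarmSharedAppName")
    .PersistKeysToFileSystem(new DirectoryInfo("shared/KeysDataProtection/Directory"))
    // Placeholder certificate path/password -- supply your own shared cert.
    .ProtectKeysWithCertificate(new X509Certificate2("shared/dp-keys.pfx", "<pfx-password>"));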
