AuthBot with Azure Table Storage - C#

I'm using AuthBot to log users in on my bot, but I'm handling a large amount of data, and it started giving me an error that I'm overloading it (stack overflow).
So I did as suggested and created an Azure Table Storage account, but since then my bot no longer recognizes the authentication. It seems AuthBot cannot get/set its data from Table Storage. Do you know anything about this?

The current AuthBot uses the default state client. I've submitted a PR to fix this: https://github.com/MicrosoftDX/AuthBot/pull/37/files
In the interim, you can download the AuthBot source and include the changes to OAuthCallbackController in your project.
Edit:
This repo will eventually replace AuthBot: https://github.com/richdizz/BotAuth. It already uses the correct state client interfaces.
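For reference, here is a minimal sketch of pointing Bot Framework v3 state at Azure Table Storage; it assumes the Microsoft.Bot.Builder.Azure package and is typically called from Global.asax, so treat the exact registration as illustrative rather than AuthBot's own code. Once the OAuthCallbackController change above is in, AuthBot's tokens go through this same IBotDataStore instead of the default state client.

using System.Configuration;
using Autofac;
using Microsoft.Bot.Builder.Azure;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Bot.Builder.Dialogs.Internals;
using Microsoft.Bot.Connector;

public static class BotStateConfig
{
    // Call once from Application_Start in Global.asax.
    public static void UseTableStorageForBotState()
    {
        var store = new TableBotDataStore(
            ConfigurationManager.AppSettings["StorageConnectionString"]);

        Conversation.UpdateContainer(builder =>
        {
            // Replace the default state client with the Table Storage store.
            builder.Register(c => store)
                   .Keyed<IBotDataStore<BotData>>(AzureModule.Key_DataStore)
                   .AsSelf()
                   .SingleInstance();

            // Wrap it in the caching store the SDK expects.
            builder.Register(c => new CachingBotDataStore(store,
                       CachingBotDataStoreConsistencyPolicy.ETagBasedConsistency))
                   .As<IBotDataStore<BotData>>()
                   .AsSelf()
                   .InstancePerLifetimeScope();
        });
    }
}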

signature verification failed; please verify account number and chain-id: unauthorized

You don't want to know how many hours I've spent trying to figure this one out.
I'm trying to send a broadcast to the Cosmos blockchain via the Terra Pisco/Rebel1 API.
LUNA2: https://pisco-lcd.terra.dev/swagger/#/
LUNC: https://rebel1.grouptwo.org/swagger/#/
When I run the simulation API, the transaction works perfectly.
However, when I run the transaction via the standard broadcast API, it fails with a "signature verification failed" error.
This error usually means the signature passed to Cosmos does not contain the correct Chain-Id or sequence number. However, I've confirmed that the base64 payload has them, and all the payloads being sent are correct (Protobuf and REST JSON), according to the Cosmos SDK support team.
I have opened an issue in the cosmos-sdk repo below (it contains all the details, models, DTOs, and protos I pass to Cosmos):
https://github.com/cosmos/cosmos-sdk/issues/14789
I've verified the Protobuf data against terra-money's terra.js library (https://github.com/terra-money/terra.js), and the data looks correct.
I've also confirmed that the signature contains the correct Chain-Id & Sequence number.
This issue is related to the C# SDK that I'm currently building. It allows users to broadcast Tx data to Terra (Cosmos).
Here's the link, if you want to submit a PR:
https://github.com/TerraMystics/terra-sharp
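For anyone hitting the same error, here is a minimal sketch of how the SIGN_MODE_DIRECT sign bytes are typically assembled in C#; the class names assume standard protoc-generated types for the cosmos.tx.v1beta1 protos (they are not taken from the repo above). The point is that the chain-id and account number live in the SignDoc, while the sequence number lives in AuthInfo's SignerInfo, so a stale sequence fails verification even when the SignDoc itself is correct.

using System.Security.Cryptography;
using Google.Protobuf;
using Cosmos.Tx.V1Beta1;   // assumed protoc-generated namespace

static byte[] BuildSignBytesHash(TxBody body, AuthInfo authInfo,
                                 string chainId, ulong accountNumber)
{
    // Chain-id and account number go into the SignDoc itself;
    // the sequence number is inside authInfo.SignerInfos[0].Sequence.
    var signDoc = new SignDoc
    {
        BodyBytes = body.ToByteString(),
        AuthInfoBytes = authInfo.ToByteString(),
        ChainId = chainId,            // e.g. "pisco-1"
        AccountNumber = accountNumber,
    };

    // The secp256k1 signature is computed over the SHA-256 of these bytes,
    // and the same body/auth-info bytes must be broadcast unchanged.
    using (var sha256 = SHA256.Create())
    {
        return sha256.ComputeHash(signDoc.ToByteArray());
    }
}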

Is there a way to fetch all DocumentDB resources from Azure?

My goal is to fetch all the resources from Azure into our local database. I've been able to fetch various types of resources (virtual machines, websites, storage accounts, etc.) using the C# SDK (https://github.com/Azure/azure-sdk-for-net). However, it seems there's no wrapper for the DocumentDB resource; what's more, I can't even find an endpoint to query it. Do you know how to fetch all DocumentDB resources?
Try something like this PowerShell cmdlet:
Find-AzureRmResource -ResourceType "Microsoft.DocumentDB/databaseAccounts"
Unfortunately, this doesn't seem to work exactly as expected, which is kind of weird. If I run this:
Find-AzureRmResource -ResourceType "Microsoft.Storage/storageAccounts"
It outputs all the storage accounts. Go figure...
Edit:
Yes, it is working fine. I was looking in the wrong subscription for the DocumentDB accounts... idiot.
As for doing it with REST, @macpak, I haven't done it myself, but I suspect you could do something like:
GET https://management.azure.com/subscriptions/yourSubId/resourceGroups/yourResourceGroupName/providers/Microsoft.DocumentDB/databaseAccounts?api-version=2016-03-31
This is how https://resources.azure.com works; check it out. Obviously you parameterise the subscription ID and resource group name and build the URI. You would also need to get a bearer token and put it in the Authorization header. Use Postman and/or Fiddler.
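If it helps, here is a minimal C# sketch of the same REST call (assuming you already have an ARM bearer token and valid subscription/resource group names; none of this comes from the SDK mentioned above):

using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

static async Task<string> ListDocumentDbAccountsAsync(
    string subscriptionId, string resourceGroup, string bearerToken)
{
    var uri = "https://management.azure.com/subscriptions/" + subscriptionId +
              "/resourceGroups/" + resourceGroup +
              "/providers/Microsoft.DocumentDB/databaseAccounts?api-version=2016-03-31";

    using (var client = new HttpClient())
    {
        // The ARM endpoint expects the AAD token in the Authorization header.
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", bearerToken);

        var response = await client.GetAsync(uri);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();   // JSON list of accounts
    }
}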

More Azure data requests than I can reconcile through the Azure dashboard

I recently deployed our first Azure Web App service and it was a pretty painless experience.
It was just a simple RequestBin-like API app to store IDs fired by a webhook into an Azure table, plus another endpoint to query whether a given ID is present. This is used to test the webhook in our deployment tests.
It works great; however, at most I'm expecting maybe 60 table requests to hit the storage account per day, in write and read pairs.
In the last 24 hours I've received about 10.23k requests (pretty consistently through the night), as well as queue and blob requests I don't make through the API. (Screenshot of the storage account requests.)
Looking through the storage account's audit logs, I see almost exclusively ListKeys operations with the 'Caller' column blank.
(Audit log screenshot.)
Does this mean these come from an internal Azure process? Some are me, but I would think that was just me checking through the dashboard.
The deployment tests themselves aren't live, and the data table only contains the two initial test entities I inserted during testing, so I can't really account for these requests. Rookie mistake I'm sure, but any ideas?
Bonus: I use the block below to initialise my data table. It resides in the API client class's constructor, on a free-tier instance. Does table.CreateIfNotExists() count as a data transaction, and does having it in the constructor hammer the call as Azure moves the app across processes on the free tier?
// Parse the storage connection string from config and get a table client.
_storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
_tableClient = _storageAccount.CreateCloudTableClient();
table = _tableClient.GetTableReference("webhookreceipts");
// Create the table if it doesn't exist.
table.CreateIfNotExists();
Thanks.
Update:
I have left it running overnight again, and it appears to have followed the same pattern as before, cycling around 500 requests per hour through the night.
First, I would suggest clicking the "ListKeys" entries in the audit log to see the detailed info; look at the 'EVENT TIMESTAMP' and 'CALLER' properties to see when each call was triggered and by whom. Second, comment out all the code related to the Azure table and see whether the requests continue. Third, create a new Azure account and see whether it has the same issue. Everything works on my side; if the issue still exists, I would suggest contacting Azure support for better help.
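On the bonus question, here is a minimal sketch (using the same classic storage SDK as the snippet above) of initialising the table once per process, so CreateIfNotExists is a single call at startup rather than one per constructor invocation; the class and member names are illustrative:

using System;
using Microsoft.Azure;                       // CloudConfigurationManager (adjust to your SDK version)
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public static class WebhookTable
{
    private static readonly Lazy<CloudTable> _table = new Lazy<CloudTable>(() =>
    {
        var account = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("StorageConnectionString"));
        var table = account.CreateCloudTableClient()
                           .GetTableReference("webhookreceipts");
        table.CreateIfNotExists();   // one request per process, not per API call
        return table;
    });

    public static CloudTable Instance
    {
        get { return _table.Value; }
    }
}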

RESTful APIs for Azure Analytics Services not working

Alright, I'll be straightforward here. I successfully called the Windows Azure Analytics Services REST APIs for getting and setting the settings for Blob logging and metrics.
However, when I give it a go for tables and queues, I get the following error message:
AuthenticationFailed Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:9d4436e0-9367-46ed-9967-b3ebe888d2f8 Time:2012-01-16T09:20:09.5141262Z
The string I use to sign is the following:
GET\n\n\n\n\n\n\n\n\n\n\n\nx-ms-date:Mon, 16 Jan 2012 09:04:50 GMT\nx-ms-version:2011-08-18\n/<accountname>/\ncomp:properties\nrestype:service
It works perfectly fine for Blobs.
The most troublesome thing is that I am not getting an AuthenticationErrorDetail in my response from the Analytics Services. When I tried calling the settings REST APIs for Blobs, I actually got an AuthenticationErrorDetail that told me what string the server used to sign. That really helped me construct the string above.
Has anyone else gone through something similar?
I realised that my REST calls worked for queues too. They did not work for tables, however.
http://msdn.microsoft.com/en-us/library/windowsazure/dd179428.aspx offered more information:
2009-09-19 Shared Key Lite and Table Service Format
This format supports Shared Key and Shared Key Lite for all versions
of the Table service, and Shared Key Lite for the 2009-09-19 version
of the Blob and Queue services. This format is identical to that used
with previous versions of the storage services. Construct the
CanonicalizedResource string in this format as follows:
Beginning with an empty string (""), append a forward slash (/),
followed by the name of the account that owns the resource being
accessed.
Append the resource's encoded URI path. If the request URI addresses a
component of the resource, append the appropriate query string. The
query string should include the question mark and the comp parameter
(for example, ?comp=metadata). No other parameters should be included
on the query string.
In the end, it accepted the path ?comp=properties.
I encountered similar problems - blobs working fine, tables not working - when I incorrectly used DateTime.Now instead of DateTime.UtcNow for the x-ms-date header.
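For completeness, here is a minimal sketch of Shared Key Lite signing for the Table service, following the canonicalized-resource format quoted above; the account name and key are placeholders, and note the UtcNow point from the answer:

using System;
using System.Security.Cryptography;
using System.Text;

static string BuildTableAuthHeader(string accountName, string accountKeyBase64,
                                   string canonicalizedResource, out string xMsDate)
{
    // Must be UTC; using DateTime.Now here is exactly the bug described above.
    xMsDate = DateTime.UtcNow.ToString("R");

    // Table service Shared Key Lite: Date + "\n" + CanonicalizedResource,
    // e.g. "/<accountname>/?comp=properties" for the service properties call.
    string stringToSign = xMsDate + "\n" + canonicalizedResource;

    using (var hmac = new HMACSHA256(Convert.FromBase64String(accountKeyBase64)))
    {
        string signature = Convert.ToBase64String(
            hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
        return "SharedKeyLite " + accountName + ":" + signature;
    }
}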

Adding Windows User Name to database

ASP.NET MVC3 newb here.
I am working on an MVC3 intranet application that uses Windows authentication. Windows auth is set up and ready to go, no problem.
What I need is to capture the username (Domain\first.last) whenever a user accesses the app for the first time, store that information in a database, and associate that name with another unique identifier (numeric and auto-incremented).
I already found a way to get the username:
string user = WindowsIdentity.GetCurrent().Name;
What I am having an issue with is taking that variable and storing it in my database.
Any suggestions, hints, tips or nudges towards helpful resources are greatly appreciated.
Apologies if this scenario was posted elsewhere; if it was, I was unable to locate it.
Cheers Guys!
Be careful - user names and display names can change. I would avoid storing them in the database.
Instead, look at storing the SID (the ID of the user). The User property of WindowsIdentity returns the SID. You can store and update the user name for display purposes, but don't rely on it for tying the authenticating user back to the previous user in your DB.
See this SO post as well:
How can I get the SID of the current Windows account?
Persist the SID (along with the username, for display only) and look up via the SID.
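For example, a quick sketch of capturing both values (variable names are just illustrative):

using System.Security.Principal;

var identity = WindowsIdentity.GetCurrent();
string sid = identity.User.Value;        // e.g. "S-1-5-21-..." - the stable key
string displayName = identity.Name;      // "DOMAIN\first.last" - store for display only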
I think what you are looking for here is really 'how to store some info in a database'
What database system?
Check out
http://www.datasprings.com/resources/articles-information/a-quick-guide-to-using-entity-framework
You can easily use Entity Framework to store that value in the database, which is what I think your question was really about. I do agree with Bryanmac, though: you should be storing the SID, not the login name, in case the name changes. If you know it will NEVER change then you could store the name, but since names can technically change in Windows, I'd be aware of this.
If you then have a specific question on how to store that field using Entity Framework, please post it.
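As a rough illustration of that approach (code-first Entity Framework; the AppUser and UsersContext names are made up for this example, not from the question):

using System.Data.Entity;
using System.Linq;

public class AppUser
{
    public int Id { get; set; }            // numeric, auto-incremented key
    public string Sid { get; set; }        // stable identifier for lookups
    public string UserName { get; set; }   // "DOMAIN\first.last", display only
}

public class UsersContext : DbContext
{
    public DbSet<AppUser> Users { get; set; }
}

// On a user's first visit, insert a row if the SID isn't already there:
// using (var db = new UsersContext())
// {
//     if (!db.Users.Any(u => u.Sid == sid))
//     {
//         db.Users.Add(new AppUser { Sid = sid, UserName = userName });
//         db.SaveChanges();
//     }
// }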
When you create your MVC3 application, there is an option for "Intranet Application" that has all of the Windows Authentication plumbing working already; you might want to check that out and pull over pieces of the code for your current project (or migrate what you have, depending on how far along you are).
It places some special code into Web.config, along with the extra files it creates.
Read more here:
http://weblogs.asp.net/gunnarpeipman/archive/2011/04/14/asp-net-mvc-3-intranet-application-template.aspx
and here:
http://msdn.microsoft.com/en-us/library/gg703322%28VS.98%29.aspx
Also you'll want to use User.Identity.Name to reference the person viewing the website.
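For instance, a small sketch in a controller (in an intranet app this gives the visiting user's account, whereas WindowsIdentity.GetCurrent() can return the application pool identity):

using System.Web.Mvc;

public class HomeController : Controller
{
    public ActionResult Index()
    {
        // The authenticated visitor from the request principal, e.g. "DOMAIN\first.last".
        ViewBag.UserName = User.Identity.Name;
        return View();
    }
}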
