Is it possible for GeckoFX to use a separate CookieContainer per instance? - c#

I'm using GeckoFX 22.0 and XULRunner 22.0. Since GeckoWebBrowser in .NET shares cookies with all other GeckoWebBrowser instances, I would like each GeckoWebBrowser to have its own cookie container that doesn't share any cookies created previously in other GeckoWebBrowsers or other instances.
For example, when I create a GeckoWebBrowser it shouldn't have any cookies, and when I run two instances of GeckoWebBrowser they should each have their own cookie container and not share or conflict cookies with each other.
How is that possible?
I've tried various approaches, such as creating a separate class and initializing GeckoFX there, but when running different browsers at the same time they still share cookies. If I remove cookies from one browser, the same happens in the other browsers too. I have set the proxy and user agent at different times and that works, but I can't apply different user agents to multiple browsers at the same time.
public void Initiate()
{
    Gecko.Xpcom.Initialize(AppDomain.CurrentDomain.BaseDirectory + "/xulrunner");
    if (this.IsProxySet)
    {
        Gecko.GeckoPreferences.User["network.proxy.http"] = this.Host;
        Gecko.GeckoPreferences.User["network.proxy.http_port"] = this.Port;
        Gecko.GeckoPreferences.User["network.proxy.type"] = 1;
    }
    if (IsUseragentSet)
    {
        Gecko.GeckoPreferences.User["general.useragent.override"] = this.Useragent;
    }
}
And to remove cookies I'm using the following code:
nsICookieManager CookieMan;
CookieMan = Xpcom.GetService<nsICookieManager>("#mozilla.org/cookiemanager;1");
CookieMan = Xpcom.QueryInterface<nsICookieManager>(CookieMan);
CookieMan.RemoveAll();
Help will be appreciated!

You could possibly try implementing your own cookie manager that supports this:
see the unit test Register_AfterDefaultFactoryHasBeenUnregistered_NewCookieServiceIsUsedInsteadOfDefaultOne for an example of how to do this.
Note that the code below is currently untested and may contain typos, and it requires a GeckoFX version newer than v22.0-0.6.
[Guid("c375fa80-150f-11d6-a618-0010a401eb10")]
[ContractID(TestCookieServiceFactory.ContractID)]
public class TestCookieServiceFactory
    : GenericOneClassNsFactory<TestCookieServiceFactory, TestCookieService>
{
    public const string ContractID = "#mozilla.org/cookieService;1";
}

public class TestCookieService : nsICookieService
{
    // Implement nsICookieService...
}
public void Main()
{
    Xpcom.Initialize("My Xulrunner/Firefox location");
    var existingFactoryDetails = TestCookieServiceFactory.Unregister();
    TestCookieServiceFactory.Register();
    var browser = new GeckoWebBrowser();
    // Add browser to form etc...
    browser.Navigate("http://SomeWebPageThatUsesCookies");
    // Cookie requests should now be sent to TestCookieService; process them as you want.
}

Related

Executing AddMembershipRule on SMS_Collection through REST Admin Service

We're trying to use the REST API of the Administration Service to manage Configuration Manager (What is the administration service in Configuration Manager?).
We have successfully queried entities of different types and executed some custom static methods (e.g. the MoveMembers method on SMS_ObjectContainerItem). It's all mostly blind shots, as there is barely any documentation, but those basic functionalities seem to work fine.
Now we have hit a wall trying to add collection rules to an SMS_Collection (existing or new). This was normally done by calling AddMembershipRule on the instance itself, previously fetched by e.g. WqlConnectionManager or some other proxy. However, this is clearly a no-go on a plain object fetched from the REST service.
We have tried to use the WMI OData service (through a generated proxy), as it clearly offers similar functionality, but this ends up with a "Not supported" exception:
var savedCollection = Proxy.SMS_Collection.Where(c => c.CollectionID == result.CollectionID).FirstOrDefault();
savedCollection.AddMembershipRule(inclusionRule);
Proxy.UpdateObject(savedCollection);
Proxy.SaveChanges(); //EXCEPTION
I have tried running POST requests in numerous ways, using URLs like:
SMS_Collection.AddMembershipRule?CollectionID=DV000037 -> 404
SMS_Collection/DV000037/AddMembershipRule -> 404
SMS_Collection.DV000037.AddMembershipRule -> 404
SMS_Collection/DV000037.AddMembershipRule -> treated as a POST to SMS_Collection/DV000037, and therefore triggers an update
or just
SMS_Collection.AddMembershipRule with the collection ID as a param.
As the request body I've used the following (or just the AddCollectionMembershipRuleRequestRule):
public class AddCollectionMembershipRuleRequest
{
    public AddCollectionMembershipRuleRequestRule CollectionRule { get; set; }
}

public class AddCollectionMembershipRuleRequestRule
{
    public string RuleName { get; set; }
    public string IncludeCollectionID { get; set; }
}
I have also tried to POST an existing or new collection with CollectionRules prefilled, but this ends up with an exception complaining about IncludeCollectionID not being part of CollectionRule (the base class) - it looks like the validation is too strict and doesn't deal well with inheritance.
var collectionRequest = new ECMCollectionCreationRequest()
{
    Name = collectionName,
    CollectionType = 2,
    RefreshType = 4,
    LimitToCollectionID = DefaultLimitingCollectionID,
    CollectionRules = new List<SMS_CollectionRule>()
    {
        new SMS_CollectionRuleIncludeCollection()
        {
            RuleName = "MembershipRule",
            IncludeCollectionID = "DV100020"
        }
    }
};
Still, no luck with any of those. Do you have any idea whether such a scenario (modifying CollectionRules) is even supported by the REST/OData service? If so, what would be the right way to achieve it?
It looks like this part is simply not supported at the moment. Looking at the code, it seems the service is not interpreting the arguments properly, which causes the validation issues.
However, the same can be achieved, in a somewhat less clean and structured way, using ManagementScope and ManagementObject:
var scope = new ManagementScope(siteAddress);
scope.Connect();
using (ManagementObject collection = new ManagementObject(scope,
    new ManagementPath($"SMS_Collection.CollectionID='{collectionID}'"), new ObjectGetOptions()))
{
    if (collection == null)
        throw new Exception($"Unable to find collection with ID '{collectionID}'");
    collection.Get();
    using (ManagementBaseObject inParams = collection.GetMethodParameters("AddMembershipRule"))
    using (ManagementClass ruleClass = new ManagementClass(scope, new ManagementPath("SMS_CollectionRuleDirect"), new ObjectGetOptions()))
    using (ManagementObject rule = ruleClass.CreateInstance())
    {
        rule["ResourceClassName"] = "SMS_R_System";
        rule["ResourceID"] = ecmResourceID;
        rule["RuleName"] = machineName;
        inParams["collectionRule"] = rule;
        collection.InvokeMethod("AddMembershipRule", inParams, null);
    }
}
One can add and remove all the other rule types in a similar way.
Another alternative is of course to use PowerShell. Still, I hope that support for these methods will be added in one of the next iterations of the Administration Service.
Similarly, there seems to be no way to add/remove applications or packages, or to import/export them, using the admin service or even in the way mentioned above.
$Rule="{'collectionRule':{`"#odata.type`": `"#AdminService.SMS_CollectionRuleDirect`", `"ResourceClassName`": `"SMS_R_System`", `"ResourceID`": $DeviceID,`"RuleName`": `"$MachineName`"}}"
$RuleCreated = (Invoke-RestMethod -Method Post -Uri "https://$($CMProvider)/AdminService/wmi/SMS_Collection('$CollectionID')/AdminService.AddMembershipRule" -Body $Rule -ContentType 'application/json' -Credential $Cred)
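For reference, a rough C# equivalent of the PowerShell call above might look like the following untested sketch; it mirrors the request body shown above, assumes integrated Windows authentication, and the host, collection ID, device ID and machine name are placeholders:
// Untested sketch: POST AddMembershipRule to the AdminService, mirroring the PowerShell call above.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class AdminServiceClient
{
    public static async Task AddDirectRuleAsync(string cmProvider, string collectionID, int deviceID, string machineName)
    {
        // Assumes integrated auth, analogous to -Credential in the PowerShell call above.
        var handler = new HttpClientHandler { UseDefaultCredentials = true };
        using (var client = new HttpClient(handler))
        {
            var url = $"https://{cmProvider}/AdminService/wmi/SMS_Collection('{collectionID}')/AdminService.AddMembershipRule";
            // Body mirrors the JSON used in the PowerShell example above.
            var body = "{\"collectionRule\":{\"#odata.type\":\"#AdminService.SMS_CollectionRuleDirect\"," +
                       "\"ResourceClassName\":\"SMS_R_System\"," +
                       "\"ResourceID\":" + deviceID + "," +
                       "\"RuleName\":\"" + machineName + "\"}}";
            var response = await client.PostAsync(url, new StringContent(body, Encoding.UTF8, "application/json"));
            response.EnsureSuccessStatusCode();
        }
    }
}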

Pin is currently opened in an incompatible sharing mode

I am currently using the UWP Toolkit to navigate among app pages. There is a page that is used to initialize and open Raspberry Pi GPIO pins. The following error occurs after navigating away from that page and then trying to navigate back to it:
The process cannot access the file because it is being used by another process. Pin ' is currently opened in an incompatible sharing mode. Make sure this pin is not already in use by this application or another application.
I can see that the constructor is called each time the page is visited, and hence there is an attempt to open pins that are already open. What is the best way to overcome this issue?
You may add NavigationCacheMode = NavigationCacheMode.Required; to the constructor of the page so your app will not create a new instance of it when you navigate there.
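As a minimal sketch (GpioPage here is just a placeholder for your own page class):
public sealed partial class GpioPage : Page
{
    public GpioPage()
    {
        this.InitializeComponent();
        // Keep this page instance alive across navigations so the GPIO pins are only opened once.
        this.NavigationCacheMode = NavigationCacheMode.Required;
    }
}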
What I always do is let a class deal with managing pins so that your user code can request pins to act on.
public class IO
{
    private readonly GpioController _gpioController;
    private readonly Dictionary<int, GpioPin> _pins;

    public IO(GpioController gpioController)
    {
        _gpioController = gpioController;
        _pins = new Dictionary<int, GpioPin>();
    }

    public GpioPin OpenPin(int pin, GpioSharingMode mode)
    {
        if (_pins.ContainsKey(pin))
        {
            var gpioPin = _pins[pin];
            if (gpioPin.SharingMode == mode)
            {
                return gpioPin;
            }
            throw new ArgumentException($"Pin '{pin}' is already configured in mode '{gpioPin.SharingMode}'");
        }
        else
        {
            var gpioPin = _gpioController?.OpenPin(pin, mode);
            _pins[pin] = gpioPin;
            return gpioPin;
        }
    }
}
Then my viewmodels simply request a pin as follows:
public MainViewModel()
{
    _io = ServiceContainer.Instance.Get<IO>();
    _brakingPin = _io.OpenPin(4, GpioSharingMode.Exclusive);
    _io.SetDriveMode(_brakingPin, GpioPinDriveMode.Output);
    _io.Write(_brakingPin, GpioPinValue.Low);
}
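The SetDriveMode and Write helpers used above are not part of the IO class shown; assuming they simply delegate to the underlying GpioPin, they could look like this:
// Hypothetical pass-through helpers on the IO class, delegating to the GpioPin API.
public void SetDriveMode(GpioPin pin, GpioPinDriveMode driveMode)
{
    pin?.SetDriveMode(driveMode);
}

public void Write(GpioPin pin, GpioPinValue value)
{
    pin?.Write(value);
}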

Abot Crawler Omit CrawledPage HttpWebRequest/Response

I am using Abot in a WPF application which displays a browser control (CefSharp).
The user logs in, and whatever custom authentication the site uses will work while crawling, the same way as if the user were actually browsing the site.
Thus, when I crawl, I want to use this browser control to make the request and simply return the page data.
Therefore I've implemented my own custom PageRequester; the complete listing is below.
The problem is that with CefSharp, as with other browser controls, it's not possible to get the HttpWebRequest/Response associated with a CrawledPage.
Without setting these two properties, Abot does not proceed with the crawl.
Is there something I can do to circumvent this problem?
Code listing:
using Abot.Core;
using Abot.Poco;
using CefSharp.Wpf;
using System;
using System.Net;
using System.Text;
using System.Threading;

public class CefPageRequester : IPageRequester
{
    private MainWindowDataContext DataContext;
    private ChromiumWebBrowser ChromiumWebBrowser;
    private CrawlConfiguration CrawlConfig;
    private volatile bool _navigationCompleted;
    private string _pageSource;

    public CefPageRequester(MainWindowDataContext dataContext, ChromiumWebBrowser chromiumWebBrowser, CrawlConfiguration crawlConfig)
    {
        this.DataContext = dataContext;
        this.ChromiumWebBrowser = chromiumWebBrowser;
        this.CrawlConfig = crawlConfig;
        this.ChromiumWebBrowser.FrameLoadEnd += ChromiumWebBrowser_FrameLoadEnd;
    }

    public CrawledPage MakeRequest(Uri uri)
    {
        return this.MakeRequest(uri, cp => new CrawlDecision() { Allow = true });
    }

    public CrawledPage MakeRequest(Uri uri, Func<CrawledPage, CrawlDecision> shouldDownloadContent)
    {
        if (uri == null)
            throw new ArgumentNullException("uri");
        CrawledPage crawledPage = new CrawledPage(uri);
        try
        {
            //the browser control is bound to the address of the data context,
            //if we set the address directly it breaks for some reason, although it's a two way binding.
            this.DataContext.Address = uri.AbsolutePath;
            crawledPage.RequestStarted = DateTime.Now;
            crawledPage.DownloadContentStarted = crawledPage.RequestStarted;
            while (!_navigationCompleted)
                Thread.CurrentThread.Join(10);
        }
        catch (WebException e)
        {
            crawledPage.WebException = e;
        }
        catch
        {
            //bad luck, we should log this.
        }
        finally
        {
            //TODO must add these properties!!
            //crawledPage.HttpWebRequest = request;
            //crawledPage.HttpWebResponse = response;
            crawledPage.RequestCompleted = DateTime.Now;
            crawledPage.DownloadContentCompleted = crawledPage.RequestCompleted;
            if (!String.IsNullOrWhiteSpace(_pageSource))
                crawledPage.Content = this.GetContent("UTF-8", _pageSource);
            _navigationCompleted = false;
            _pageSource = null;
        }
        return crawledPage;
    }

    private void ChromiumWebBrowser_FrameLoadEnd(object sender, CefSharp.FrameLoadEndEventArgs e)
    {
        if (!e.IsMainFrame)
            return;
        this.ChromiumWebBrowser.Dispatcher.BeginInvoke(
            (Action)(() =>
            {
                _pageSource = this.ChromiumWebBrowser.GetSourceAsync().Result;
                _navigationCompleted = true;
            }));
    }

    private PageContent GetContent(string charset, string html)
    {
        PageContent pageContent = new PageContent();
        pageContent.Charset = charset;
        pageContent.Encoding = this.GetEncoding(charset);
        pageContent.Text = html;
        pageContent.Bytes = pageContent.Encoding.GetBytes(html);
        return pageContent;
    }

    private Encoding GetEncoding(string charset)
    {
        Encoding e = Encoding.UTF8;
        if (charset != null)
        {
            try
            {
                e = Encoding.GetEncoding(charset);
            }
            catch { }
        }
        return e;
    }
}
The question can also be phrased as: how do I avoid having to create an HttpWebResponse from a stream? Which seems impossible, given that MSDN says:
You should never directly create an instance of the HttpWebResponse class. Instead, use the instance returned by a call to HttpWebRequest.GetResponse.
I would have to actually post the request to get the response, which is precisely what I want to avoid by using a web browser control.
As you are aware, a lot of functionality depends on the HttpWebRequest and HttpWebResponse being set. I've ordered a few options for you off the top of my head...
1) Refactor Abot to use POCO abstractions instead of those classes. Then just have a converter that converts the real HttpWebRequest and HttpWebResponse to those POCO types, as well as a converter that converts your browser object's response into those POCOs (see the sketch after this list).
2) Create a CustomHttpWebRequest and CustomHttpWebResponse that inherit from the .NET classes so you can access/override the public/protected properties, which may allow you to manually create an instance that models the request/response your browser component returns to you. I know this can be tricky, but it may work (I've never done it, so I can't say for sure).
3) [I HATE THIS IDEA. It SHOULD BE YOUR LAST RESORT] Create a real instance of these classes and use reflection to set whatever properties/values need to be set to satisfy all of Abot's usages.
4) [I HATE THIS IDEA EVEN MORE] Use MS Fakes to create shims/stubs/fakes for the properties and methods of HttpWebRequest and HttpWebResponse. Then you could configure them to return your values. This tool is usually only used for testing, but I believe it can be used in production code if you are desperate, don't care about performance and/or are insane.
I included the terrible ideas as well just in case they help spark some thought. Hope that helps...
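As a minimal sketch of what the POCO abstraction from option 1 could look like (the type and property names are illustrative only and are not part of Abot):
// Illustrative POCOs standing in for HttpWebRequest/HttpWebResponse inside Abot.
public class CrawlRequestInfo
{
    public Uri RequestUri { get; set; }
    public string Method { get; set; }
    public string UserAgent { get; set; }
}

public class CrawlResponseInfo
{
    public int StatusCode { get; set; }
    public string ContentType { get; set; }
    public string Charset { get; set; }
    public string Content { get; set; }
}

// Hypothetical converter: map whatever the browser control hands back into the POCOs.
public static class BrowserResponseConverter
{
    public static CrawlResponseInfo FromPageSource(string html)
    {
        return new CrawlResponseInfo
        {
            StatusCode = 200,          // assumption: the browser only hands us successfully loaded pages
            ContentType = "text/html",
            Charset = "UTF-8",
            Content = html
        };
    }
}
The real HttpWebRequest/HttpWebResponse would be converted into the same POCOs on Abot's default path, so the rest of the pipeline never has to touch the concrete .NET classes.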

Caching ASP.NET Web API with CacheCow

I am trying to implement caching using CacheCow. I have two problems:
In some cases I need to manually invalidate the cache of some resources.
For example, I have a resource called purchase and another called pointMovements. They are not directly connected, but doing a POST on purchase implies some changes to pointMovements. CacheCow does not detect these changes because I am not calling the pointMovements API, so when I call the pointMovements endpoint the values are cached and I cannot get the new values.
To solve this, I need to invalidate the cache manually. How is that possible?
There are some controllers that I don't want to cache at all. I am trying to use attributes to do that, but it is not working. I am following this article, but the attributes are ignored.
How can I specify which controllers to cache?
I came across the same set of problems and found a solution for problem 2 (disable caching regardless of the default settings).
// This forces the server to not provide any caching by refreshing its cache table immediately (0 sec)
[HttpCacheRefreshPolicy(0)]
// This forces the client (browser) to not cache any data returned from the server (even if an ETag is present) by setting the time-out to 0 and no-cache to true.
[HttpCacheControlPolicy(true, 0, true)]
public class MyController : ApiController { ... }
The attributes must be applied together for this to work. You can also control the caching at the action level by providing the same rules to each action.
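For example, the per-action equivalent could look like this (the controller and action names are placeholders, assuming a standard Web API 2 action):
public class PointMovementsController : ApiController
{
    // Same attribute pair as above, applied to a single action instead of the whole controller.
    [HttpCacheRefreshPolicy(0)]
    [HttpCacheControlPolicy(true, 0, true)]
    public IHttpActionResult Get()
    {
        // ... return the resource; responses from this action will not be cached
        return Ok();
    }
}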
I've still to figure out the solution for problem 1, but watch this space for updates.
Update
I have found a solution to problem 1.
Register the CachingHandler with your IoC container (in my case it's IUnityContainer)
Inject the ICachingHandler into your Web API controller.
To invalidate the resource, use ICachingHandler.InvalidateResource(HttpRequestMessage)
Please see a code example below. The solution has been tested.
public class Bootstrapper
{
    //...
    // Create a new caching handler and register it with the container.
    public void RegisterCache(HttpConfiguration config, IUnityContainer container)
    {
        var cachingHandler = new CachingHandler(config);
        // ...
        container.RegisterInstance<ICachingHandler>(cachingHandler);
    }
}

public class ResourceController : ApiController
{
    private ICachingHandler _cachingHandler;

    public ResourceController(ICachingHandler cachingHandler)
    {
        _cachingHandler = cachingHandler;
    }

    [HttpPost]
    public void DeleteResource(int resourceId)
    {
        // Do the delete
        // ...
        // Now invalidate the related resource cache entry.
        // Construct an HTTP request message to the related resource.
        // HINT: "DefaultApi" may not be your API route name, so change this to match your route.
        // GOTCHA: The route matching mechanism is case sensitive, so be aware!
        var relatedResource = new HttpRequestMessage(HttpMethod.Get,
            Url.Link("DefaultApi", new { controller = "linkedresource", action = "getlinkedresource", id = resourceId }));
        // Invalidate the resource with the caching handler.
        _cachingHandler.InvalidateResource(relatedResource);
    }
}
Sorry for the late response.
As @Tri Q said, the way to do this is to use attributes, which I have explained in this blog post:
http://byterot.blogspot.co.uk/2013/03/rest-asp-net-wep-api-0.4-new-features-breaking-change-cachecow-server.html
I solved your #1 question using the code below. Here I implement the IRoutePatternProvider interface. Remember, what you return in GetRoutePattern should match what you return in GetLinkedRoutePatterns; only then will the adding and removing work. Try it out.
Inside Application_Start
CachingHandler cacheHandler = new CachingHandler(GlobalConfiguration.Configuration);
cacheHandler.RoutePatternProvider = new CacheRoutePatternProvider();
GlobalConfiguration.Configuration.MessageHandlers.Add(cacheHandler);
Custom Class
public class CacheRoutePatternProvider : IRoutePatternProvider
{
    public string GetRoutePattern(HttpRequestMessage request)
    {
        string path = request.RequestUri.AbsolutePath;
        if (!path.EndsWith("/"))
            path += "/";
        return path;
    }

    public IEnumerable<string> GetLinkedRoutePatterns(HttpRequestMessage request)
    {
        string path = request.RequestUri.AbsolutePath;
        if (!path.EndsWith("/"))
            path += "/";
        int segmentIndex;
        // return each segment of the resource hierarchy
        while ((segmentIndex = path.LastIndexOf("/")) > 0)
        {
            path = path.Substring(0, segmentIndex);
            if (path.Contains("/api/"))
                yield return path + "/";
        }
        yield break;
    }
}

Xilium.CefGlue how to execute JavaScript with return Value?

I'm quite new to multi-threaded programming and I could not understand from the Xilium example how to execute JavaScript and get the return value.
I have tested:
browser.GetMainFrame().ExecuteJavaScript("SetContent('my Text.')", null, 0);
The JavaScript is executed, but this function doesn't allow me to get the return value.
I would need to execute the following function to get all the text the user has written in the box:
browser.GetMainFrame().ExecuteJavaScript("getContent('')", null, 0);
The function TryEval should do this:
browser.GetMainFrame().V8Context.TryEval("GetDirtyFlag", out returninformation , out exx);
But this function can't be called from the browser; I think it must be called from the renderer. How can I do that?
I couldn't understand the explanations about CefRenderProcessHandler and OnProcessMessageReceived. How do I register a scriptable object and set my JavaScript and parameters?
Thanks for any suggestions on how I could solve this!
I have been struggling with this as well. I do not think there is a way to do this synchronously...or easily :)
Perhaps what can be done is this:
From the browser, do sendProcessMessage with all the JS information to the renderer process. You can pass all kinds of parameters to this call in a structured way, so encapsulating the JS method name and its params in order should not be difficult.
In the renderer process (the RenderProcessHandler's OnProcessMessageReceived method), do TryEval on the V8Context, get the return value via the out parameters, and sendProcessMessage back to the browser process with the JS return value (note that this supports ordinary return semantics from your JS method). You get the browser instance reference in OnProcessMessageReceived, so it is as easy as this (mixed pseudo code):
browser.GetMainFrame().CefV8Context.tryEval(js-code,out retValue, out exception);
process retValue;
browser.sendProcessMessage(...);
The browser will get a callback in the WebClient in OnProcessMessageReceived.
There is nothing special here in terms of setting up the JS. I have, for example, a loaded HTML page with a JS function in it. It takes a param as input and returns a string. In the js-code parameter to TryEval I simply provide this value:
"myJSFunctionName('here I am - input param')"
It is slightly convoluted, but it seems like a neat, workable approach - better than doing ExecuteJavaScript and posting results via XHR to a custom handler, in my view.
I tried this and it does work quite well indeed... and it is not bad, as it is all non-blocking. The wiring in the browser process needs to be done to process the response properly.
This can be extended and built into a set of classes to abstract this out for all kinds of calls.
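As an untested sketch of the renderer-side piece described above (the "EvalJs"/"EvalJsResult" message names are made up here, and the exact OnProcessMessageReceived/TryEval signatures may differ slightly between CefGlue versions):
internal sealed class MyRenderProcessHandler : CefRenderProcessHandler
{
    protected override bool OnProcessMessageReceived(CefBrowser browser, CefProcessId sourceProcess, CefProcessMessage message)
    {
        if (message.Name != "EvalJs")
            return base.OnProcessMessageReceived(browser, sourceProcess, message);

        // The JS snippet to evaluate is sent from the browser process as the first argument.
        string jsCode = message.Arguments.GetString(0);

        CefV8Value returnValue;
        CefV8Exception exception;
        bool ok = browser.GetMainFrame().V8Context.TryEval(jsCode, out returnValue, out exception);

        // Send the (string) result back to the browser process; WebClient.OnProcessMessageReceived picks it up there.
        var reply = CefProcessMessage.Create("EvalJsResult");
        reply.Arguments.SetString(0, ok ? returnValue.GetStringValue() : exception.Message);
        browser.SendProcessMessage(CefProcessId.Browser, reply);
        return true;
    }
}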
Take a look at the Xilium demo app. Most of the necessary wiring is already there for onProcessMessage - do a global search. Look for:
DemoRendererProcessHandler.cs - renderer side; this is where you will invoke TryEval.
DemoApp.cs - browser side; look for sendProcessMessage - this is what initiates your JS invocation.
WebClient.cs - browser side; here you receive messages from the renderer with the return value from your JS.
Cheers.
I resolved this problem by returning the result value from my JavaScript function back to the Xilium host application via an AJAX call to a custom scheme handler. According to Xilium's author, fddima, this is the easiest way to do IPC.
You can find an example of how to implement a scheme handler in the Xilium's demo app.
Check out this post: https://groups.google.com/forum/#!topic/cefglue/CziVAo8Ojg4
using System;
using System.Windows.Forms;
using Xilium.CefGlue;
using Xilium.CefGlue.WindowsForms;

namespace CefGlue3
{
    public partial class Form1 : Form
    {
        private CefWebBrowser browser;

        public Form1()
        {
            InitializeCef();
            InitializeComponent();
        }

        private static void InitializeCef()
        {
            CefRuntime.Load();
            CefMainArgs cefArgs = new CefMainArgs(new[] { "--force-renderer-accessibility" });
            CefApplication cefApp = new CefApplication();
            CefRuntime.ExecuteProcess(cefArgs, cefApp);
            CefSettings cefSettings = new CefSettings
            {
                SingleProcess = false,
                MultiThreadedMessageLoop = true,
                LogSeverity = CefLogSeverity.ErrorReport,
                LogFile = "CefGlue.log",
            };
            CefRuntime.Initialize(cefArgs, cefSettings, cefApp);
        }

        private void Form1_Load(object sender, EventArgs e)
        {
            browser = new CefWebBrowser
            {
                Visible = true,
                //StartUrl = "http://www.google.com",
                Dock = DockStyle.Fill,
                Parent = this
            };
            Controls.Add(browser);
            browser.BrowserCreated += BrowserOnBrowserCreated;
        }

        private void BrowserOnBrowserCreated(object sender, EventArgs eventArgs)
        {
            browser.Browser.GetMainFrame().LoadUrl("http://www.google.com");
        }
    }
}
using Xilium.CefGlue;

namespace CefGlue3
{
    internal sealed class CefApplication : CefApp
    {
        protected override CefRenderProcessHandler GetRenderProcessHandler()
        {
            return new RenderProcessHandler();
        }
    }

    internal sealed class RenderProcessHandler : CefRenderProcessHandler
    {
        protected override void OnWebKitInitialized()
        {
            CefRuntime.RegisterExtension("testExtension", "var test;if (!test)test = {};(function() {test.myval = 'My Value!';})();", null);
            base.OnWebKitInitialized();
        }
    }
}
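With the extension above registered, the value it exposes is available to page scripts as an ordinary JS global, so for example (reusing the ExecuteJavaScript call pattern from the question, here with a made-up snippet) you could do:
// The registered extension's value can be used from any script running in the page.
browser.Browser.GetMainFrame().ExecuteJavaScript("document.title = test.myval;", null, 0);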
