I am integrating IronPython scripts to run under a C# engine. The C# engine builds a dictionary of 'ScriptValue' objects and passes it to the IronPython script, which then uses the objects to do calculations. The 'ScriptValue' object lives in a separate class library, inherits from 'MarshalByRefObject' and is a simple .NET object (it stores double and bool values only). The script run happens frequently.
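For context, a minimal sketch of the kind of object 'ScriptValue' is (member names other than Value are illustrative; the real class simply wraps double and bool values and derives from MarshalByRefObject):

using System;

// Illustrative sketch only; the real ScriptValue lives in a separate class library.
public class ScriptValue : MarshalByRefObject
{
    // The scripts read this via scriptInputValues[...].Value
    public double Value { get; set; }

    // Hypothetical member; the real class stores "double and bool values only"
    public bool BoolValue { get; set; }

    // Returning null gives the remoting proxy an infinite lease so it is not
    // collected while a script run is in progress.
    public override object InitializeLifetimeService()
    {
        return null;
    }
}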
First Attempt:
I instantiated the IronPython engine and ran the scripts. As the runs progressed I could see that memory usage was increasing at a fast rate. Eventually, after a day of running, the application crashed with an out of memory exception. I tried both keeping an instance of the IronPython engine alive and creating a new instance on each run. I also tried shutting down the IronPython engine, but memory would still increase consistently.
Second Attempt:
After researching this a lot, suggestions came up to try running the engine in a separate AppDomain and unloading the AppDomain once you are done running the scripts. I implemented this, creating a new AppDomain and unloading it once a run has completed. This appears to help to a certain degree, but the memory leak persists, albeit creeping up at a slower rate.
I did various memory profiling and it seems that somewhere in IronPython or DLR land unmanaged memory is not getting freed, and this creeps up over time. Managed memory does seem to be cleared when the AppDomain is unloaded.
The C# engine itself is rather complex and interacts with MS SQL, IronPython, a Data Historian and an Asset Database. I won't go into the specifics of these, as I have been able to recreate the issue after stripping out all the additional components into a simple Windows Forms application.
The code I am running at the moment under a timer is:
private void RunEngine()
{
ScriptEngine pythonEngine = null;
AppDomain sandbox = null;
ScriptSource source = null;
ScriptScope scope = null;
dynamic subClass = null;
ObjectOperations ops = null;
dynamic instance = null;
dynamic result = null;
Dictionary<string, ScriptValue> scriptInputValues = GetIronPythonScriptInputAttributeValues();
Dictionary<string, ScriptValue> scriptOutputValues = GetIronPythonScriptOutputAttributes();
// Setup PythonEngine options
Dictionary<string, object> options = new Dictionary<string, object>();
//options["Debug"] = ScriptingRuntimeHelpers.True;
options["ExceptionDetail"] = ScriptingRuntimeHelpers.True;
options["ShowClrExceptions"] = ScriptingRuntimeHelpers.True;
// Create a sandbox to run the IronPython scripts in
sandbox = AppDomain.CreateDomain("IronPythonSandbox",
AppDomain.CurrentDomain.Evidence,
new AppDomainSetup() { ApplicationBase = AppDomain.CurrentDomain.BaseDirectory, ApplicationName = "IronPythonSandbox" },
new PermissionSet(PermissionState.Unrestricted));
// Create the python engine
pythonEngine = Python.CreateEngine(sandbox, options);
source = pythonEngine.CreateScriptSourceFromFile(@"\\server2\Projects\Customer\Development\Scripts\calculation.py");
var compiled = source.Compile();
scope = pythonEngine.CreateScope();
//source.Execute(scope);
compiled.Execute(scope);
subClass = scope.GetVariableHandle("Calculate");
ops = pythonEngine.Operations;
instance = ops.Invoke(subClass, scriptInputValues, scriptOutputValues);
result = instance.Unwrap();
if (scriptInputValues?.Count > 0) { scriptInputValues.Clear(); scriptInputValues = null; }
if (scriptOutputValues?.Count > 0) { scriptOutputValues.Clear(); scriptOutputValues = null; }
result = null;
instance = null;
ops = null;
subClass = null;
scope = null;
source = null;
pythonEngine?.Runtime?.Shutdown();
pythonEngine = null;
if (sandbox != null) { AppDomain.Unload(sandbox); }
sandbox = null;
}
I have now stripped the script down to bare bones to test the memory issue; it looks like this and does not carry out any actual calculations as such.
import clr
import sys
# Import integration library to allow for access to the required .Net object types
sys.path.append(r"C:\Program Files\Company\RTCM Worker") # Include the path to the .Net Library
clr.AddReference('RTCM.Worker.IPy.Integration.Library.dll')
import RTCM.Worker.IPy.Integration.Library
import System
from System.Collections.Generic import Dictionary
sys.path.append(r"\\server2\Projects\Customer\Development\Scripts") # Include the path to the module
from constants import *
from sharedfunctions import *
import math
def Calculate(scriptInputValues, scriptOutputValues):
    returnValue = True
    try:
        # Parameter validations
        if returnValue: # Only proceed with the calculation if all inputs are valid
            ## Script logging related objects
            #ENABLE_SCRIPTLOGGING = scriptOutputValues[C_EnableScriptLogging].Value
            #SCRIPT_LOG = scriptOutputValues[C_ScriptLog].Value
            # Get all the required input parameter values
            AMB_TEMP = scriptInputValues[C_AmbientTemperature].Value
            GND_AIR = scriptInputValues[C_GroundAir].Value
            MAX_DESIGN_TEMP = scriptInputValues[C_MaximumDesignTemperature].Value
            g = scriptInputValues[C_RatingCalculationConstants_g].Value
            CONDUCTOR_DIA = scriptInputValues[C_ConductorDIA].Value
            WIND_SPEED = scriptInputValues[C_WindSpeed].Value # From lookup table and no conversion needed as this is in m/s
            DEFAULT_WIND_ANGLE = scriptInputValues[C_WindBearing].Value
            SIGMA = scriptInputValues[C_Rating_Calculation_Constants_SIGMA].Value
            CONDUCTOR_EMISSIVITY = scriptInputValues[C_ConductorEmissivity].Value
            SOLAR_ABSORPTION = scriptInputValues[C_SolarAbsorption].Value
            SOLAR_DIRECT = scriptInputValues[C_SolarDirect].Value
            GROUND_REFLECTIVITY = scriptInputValues[C_GroundReflectivity].Value
            SOLAR_DIFFUSE = scriptInputValues[C_SolarDiffuse].Value
            CONDUCTOR_SKIN_EFFECT = scriptInputValues[C_ConductorSkinEffect].Value
            CONDUCTOR_MAG_EFFECT = scriptInputValues[C_ConductorMAGEffect].Value
            CONDUCTOR_DC_RESISTANCE = scriptInputValues[C_ConductorDCResistance].Value
            CONDUCTOR_ALPHA = scriptInputValues[C_ConductorAlpha].Value
            # Destroy all referenced objects
            del AMB_TEMP
            del GND_AIR
            del MAX_DESIGN_TEMP
            del g
            del CONDUCTOR_DIA
            del WIND_SPEED
            del DEFAULT_WIND_ANGLE
            del SIGMA
            del CONDUCTOR_EMISSIVITY
            del SOLAR_ABSORPTION
            del SOLAR_DIRECT
            del GROUND_REFLECTIVITY
            del SOLAR_DIFFUSE
            del CONDUCTOR_SKIN_EFFECT
            del CONDUCTOR_MAG_EFFECT
            del CONDUCTOR_DC_RESISTANCE
            del CONDUCTOR_ALPHA
            del scriptInputValues
            del scriptOutputValues
            returnValue = True
    except System.Exception as ex:
        returnValue = False
    return returnValue
Here are some screenshots of memory usage creeping up over time; notice that it is the unmanaged memory that keeps increasing:
(screenshot: start of run)
(screenshot: some time later)
(screenshot: some time later)
I am running out of options now. Are there any other suggestions on things to try?
A few other things I tried:
Setting LightweightScopes to true (see the snippet below); it did not help.
Deleting objects referenced in the IronPython script using the del keyword; it did not help either.
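For reference, the LightweightScopes option can presumably be set via the same options dictionary that is passed to Python.CreateEngine, something like:

// Assumed spelling of the DLR option key; added to the options dictionary
// before calling Python.CreateEngine(sandbox, options).
options["LightweightScopes"] = ScriptingRuntimeHelpers.True;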
Let me know if you want to know any additional details around my setup.
I had the exact same issue with memory leakage per execution of an IronPython 2.7.5 script within a C# engine.
You should manually disconnect from your MarshalByRef objects at the end of each script execution, or else you may be holding on to objects. If, within your MarshalByRef object(s), you have overridden InitializeLifetimeService to prevent remoting exceptions, it is mandatory to manually disconnect as shown here:
System.Runtime.Remoting.RemotingServices.Disconnect(MarshalByRefObject obj)
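A minimal sketch of that cleanup, assuming the ScriptValue dictionaries from the question (call it at the end of RunEngine, before the AppDomain is unloaded):

using System.Collections.Generic;
using System.Runtime.Remoting;

static void DisconnectScriptValues(Dictionary<string, ScriptValue> values)
{
    if (values == null)
        return;

    foreach (ScriptValue value in values.Values)
    {
        // Unregisters the object from the remoting infrastructure so the
        // cross-AppDomain identity table no longer holds a reference to it.
        RemotingServices.Disconnect(value);
    }
}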
Hopefully you've had success with removing Debug options from the engine, I'd like to know if that worked for you.
I'm writing a C# console app (.NET 6) running on a Windows machine that processes XSLT transformations as a batch: It reads parameter sets (which are then passed as params to the respective stylesheet) from an XML file and performs transformations using SaxonCS. (The params always contain the initial template and a path to the source file which is being read into a variable via doc($path-to-source) in said initial template).
For each transformation, an object is instantiated that represents an XSL transformation. Among other things like Console output etc., regarding the transformation, it does this:
// Instantiate SaxonCS
Processor processor = new();
XsltCompiler compiler = processor.NewXsltCompiler();
compiler.BaseUri = new Uri(pathToXsl);
XsltTransformer transformer = compiler.Compile(File.OpenRead(pathToXsl)).Load();
// Set params used by the stylesheet, using ExternalParams (a Dictionary populated earlier
// with values read from the configuration file).
foreach (var parameter in ExternalParams) { transformer.SetParameter(parameter.Key, parameter.Value); }
transformer.InitialTemplate = new QName(parametrizedInitialTemplate);
// perform transformation
XdmDestination result = new(); // effectively unused
transformer.Run(result);
transformer.Close();
result = null; // result is not needed: The processor already serialized it because of xsl:result-document()
This works great - until I try to use a file which is the result of an earlier transformation, having been written using xsl:result-document href="{filepath}" in the stylesheet, as the input for another transformation later on. This then gives me a
System.IO.IOException: 'The process cannot access the file '..(some file path)..' because it is being used by another process.'
In other words:
This works: <transform_1: A --> B>; <transform_2: J --> K>; <transform_3: X --> Y>.
This fails: <transform_1: A --> B>; <transform_2: B --> C> (because B can't be accessed).
So I somehow fail to release resources / to close the resulting output file once the transformation is finished.
Running the exact same configuration file and console app, but calling Saxon 9-HE (Java) in its own Process works perfectly fine (but very slowly); there are no file access problems then:
Process proc = new();
ProcessStartInfo startInfo = new()
{
Arguments = @$"-cp {saxonJarPath} net.sf.saxon.Transform {string.Join(" ", XslParams)} -xsl:{XslPath} -it:{InitialTemplate}",
FileName = "java",
RedirectStandardOutput = true,
RedirectStandardError = true,
UseShellExecute = false,
CreateNoWindow = true
};
proc.StartInfo = startInfo;
proc.Start();
proc.WaitForExit();
This is obviously not an ideal solution, and I would really like to speed the whole thing up and get rid of the Java dependency - that's why I'm trying to make it work with SaxonCS.
Unfortunately, I can't do "real pipelining" (as shown in one of the code examples that come with Saxon), i.e. directly use one result as the input for the next transformation, because the whole thing has to be configured externally (and not every result is the input for the next transformation).
(Because of the explanations regarding ResultDocumentHandler, I tried processor.SetProperty(Saxon.Api.Feature.ALLOW_MULTITHREADING, false);, but it didn't help.)
So: How do I prevent a file which has been produced via xsl:result-document from being locked even when the transformation has finished?
I think explicitly setting up
transformer.BaseOutputUri = new Uri(pathToXsl);
transformer.ResultDocumentHandler = (href, baseUri) => {
var serializer = processor.NewSerializer();
var fs = File.Open(new Uri(baseUri, href).LocalPath, FileMode.Create, FileAccess.Write);
serializer.OutputStream = fs;
serializer.OnClose(() => { fs.Close(); });
return serializer;
};
avoids the problem: the OnClose callback closes the FileStream as soon as Saxon has finished writing each secondary result document, so the file handle is released before a later transformation tries to read that file.
I have a question that I haven't been able to solve for a week; you won't believe it, I've already checked everything possible, but it still doesn't work correctly. I'm writing a scripting system for my game engine and I need to dynamically recompile/reload the .dll file that contains my library and scripts. To do this, I do:
Initialize Mono. (I get an appdomain (MonoDomain*) already created for me)
Open an assembly file. (dll)
Get an image from this assembly.
Use mono (I load internal calls, create C# objects, call functions, etc.).
But at some point I need to recompile/reload this assembly because I change the script code, which is understandable. As I understand from a lot of documentation and various sources, I must not free the domain, assemblies and images while they are still active, and to reload you first need to:
Create a new domain.
Load the new image.
Connect the image to an assembly object (thereby making the image status OK).
And only then free the resources from the old domain.
But no matter what I try, either an error occurs inside the Mono code, or (as now) there are no errors but the pointer to the new image obtained from the newly loaded assembly (with the same name) is the same as the old one, so Mono continues to use the same image and does not see the updated script values.
This is the code of my Mono init:
mono_set_dirs(".", ".");
// 0. Load first domain, init mono + appdomain + load assembly.
const char* domainName = "ForceCS";
const char* assemblyName = "ForceCS.dll";
pMonoDomain = mono_jit_init(domainName);
if (pMonoDomain) {
pMonoAssembly = mono_domain_assembly_open(pMonoDomain, assemblyName);
if (pMonoAssembly) {
pMonoImage = mono_assembly_get_image(pMonoAssembly);
if (pMonoImage) {
AddInternalFunctionsAndScriptClasses(); // mono_add_internal_call ...
}
}
}
Then I do some stuff with my dll, like creating objects or calling methods, as I said before. And then I try to reload ForceCS.dll.
// 1. Load new domain (Works fine).
char* newDomainName = "ForceCS";
MonoDomain* newDomain = mono_domain_create_appdomain(newDomainName, NULL);
if (!mono_domain_set(newDomain, false)) {
// Cannot set domain!
}
// 2. Load image. (Works fine)
const char* dllFile = "C:\\Force\\Editor\\ForceCS.dll";
MonoImageOpenStatus status;
MonoImage* newImage = mono_image_open(dllFile, &status); // Problem here: shows the same address as before, which means Mono keeps using the same .dll.
// 3. Load new assembly. (Works fine).
MonoAssembly* assembly = mono_assembly_load_from(newImage, mono_image_get_name(newImage), &status);
if (status != MonoImageOpenStatus::MONO_IMAGE_OK) {
///Cannot load assembly!
}
assembly = mono_assembly_open(dllFile, &status);
newImage = mono_assembly_get_image(assembly);
if (newImage) {
AddInternalFunctionsAndScriptClasses();
}
// 4. Unload old domain.
MonoDomain* domainToUnload = pMonoDomain == nullptr ? mono_domain_get() : pMonoDomain;
if (domainToUnload && domainToUnload != mono_get_root_domain()) {
mono_domain_unload(domainToUnload);
mono_gc_collect(mono_gc_max_generation());
mono_domain_finalize(domainToUnload, 2000);
mono_gc_collect(mono_gc_max_generation());
pMonoDomain = newDomain;
pMonoImage = newImage;
pMonoAssembly = assembly;
}
The final problem is that the image always stays the same; it only works if I load the assembly under a different name like ForceCS1.dll, but that's not what I want. Please explain to me:
When I need to close/free domains, assemblies and images.
What the connection is between an assembly and an image.
How to reload my .dll assembly.
I will be grateful for every answer.
Finally, after another week of torment, I managed to fix it. As I understood it, the problem was that I was unloading my only main domain without first creating/switching to another domain for Mono, so Mono had no domain to use on another thread while the assembly was being reloaded.
In general, I completely rewrote the code and ended up with something like this:
int main() {
// Init Mono Runtime. (Create Root Domain)
mono_init();
if (mono_create_domain_and_open_assembly("Scripting Domain", "MyScript.dll"))
mono_invoke("Script", "OnCreate", 0, NULL, NULL);
while (true) {
bool is_reload_request;
std::cin >> is_reload_request;
if (is_reload_request) {
mono_destroy_domain_with_associated_assembly(domain);
// Update new compiled assembly.
mono_update_assembly("C:\\MonoProject\\MyScript.dll", "C:\\MonoProjectC#\\Binary\\Debug-windows-x86_64\\MyScript.dll");
if (mono_create_domain_and_open_assembly("Scripting Domain", "MyScript.dll"))
mono_invoke("Script", "OnCreate", 0, NULL, NULL);
}
}
mono_jit_cleanup(root_domain);
return 0;
}
Now I want to explain in detail what I did, for those who may also have encountered this problem and cannot understand how to fix it.
Init Mono & create root domain.
The first thing I did was initialize the Mono runtime as before, thereby creating a root domain, so that during a reload we can switch back to it.
void mono_init() {
mono_set_dirs(".", ".");
root_domain = mono_jit_init_version("Root Domain", "v4.0.30319");
}
Creating Scripting domain, loading assembly.
Then I created a new domain, in our case the Scripting Domain, and after creating it I set it as the active domain for Mono (mono_domain_set(domain, 0)) so that all subsequent actions are performed in it. Then, just like before, I opened my assembly MyScript.dll through the domain and got pointers to the assembly and the image.
bool mono_create_domain_and_open_assembly(char* domain_name, const char* assembly_file) {
domain = mono_domain_create_appdomain(domain_name, NULL);
if (domain) {
mono_domain_set(domain, 0);
// Open assembly.
assembly = mono_domain_assembly_open(mono_domain_get(), assembly_file);
if (assembly) {
image = mono_assembly_get_image(assembly);
if (image)
return true;
}
}
return false;
}
After this stage, I checked that my assembly was working fine: I just created a Script object and called the OnCreate method on it. I will not show this code here, so I hope those who have already worked with Mono know how to do it.
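For completeness, a minimal sketch of what the C# side (MyScript.dll) could look like; only the class and method names are taken from the mono_invoke("Script", "OnCreate", 0, NULL, NULL) calls above, and making OnCreate static and parameterless is my assumption:

using System;

// Hypothetical contents of MyScript.dll used for illustration.
public class Script
{
    public static void OnCreate()
    {
        Console.WriteLine("Script.OnCreate called");
    }
}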
Reloading
Now the reload itself. In fact, everything is much simpler here than it seems: just before calling mono_domain_unload, you need to call mono_domain_set to switch from our Scripting Domain back to the Root Domain, so Mono has a valid active domain while we create the new domain (new dll).
bool mono_destroy_domain_with_associated_assembly(MonoDomain* domain_to_destroy) {
if (domain_to_destroy != root_domain) {
mono_domain_set(root_domain, 0);
mono_domain_unload(domain_to_destroy);
return true;
}
return false;
}
Then compile the C# project and copy the new dll into the project (or wherever you specified the paths for Mono). Or, as I did it, through C++:
void mono_update_assembly(char* assembly_path, char* assembly_path_from) {
if (std::filesystem::exists(assembly_path_from)) {
if (std::filesystem::exists(assembly_path))
std::filesystem::remove(assembly_path);
std::filesystem::copy(assembly_path_from, assembly_path);
}
}
And the last step: after all of that has been done, we just create a new domain and load the new assembly again.
I am trying to write a VSIX for Visual Studio 2019 that controls multiple instances of the Visual Studio IDE. We are working on a networked project that requires some automation to perform testing with multiple users. In the past I would have used DTE in an external tool, but my understanding is that as of VS2017 the COM GUIDs are no longer globally registered, so doing it within the IDE is the only way.
Regardless, I am trying to get the IVsDebugger so I can track events in the debugger. However, I am having no luck. I can get IVsDebugger2, 3, 4 and 5, but not IVsDebugger. Here is the general flow of what I am doing:
void CaptureDebugger()
{
DTE dte = GetDTE(GetRemoteProcessID());
ServiceProvider sp = new ServiceProvider((Microsoft.VisualStudio.OLE.Interop.IServiceProvider)dte);
IVsDebugger vsDebugger = sp.GetService(typeof(SVsShellDebugger)) as IVsDebugger;
// vsDebugger is null!
IVsDebugger2 vsDebugger2 = sp.GetService(typeof(SVsShellDebugger)) as IVsDebugger2;
// vsDebugger2 is not null!
}
/// <summary>
/// Gets the DTE object from any devenv process.
/// </summary>
private static EnvDTE.DTE GetDTE(int processId)
{
object runningObject = null;
IBindCtx bindCtx = null;
IRunningObjectTable rot = null;
IEnumMoniker enumMonikers = null;
try
{
Marshal.ThrowExceptionForHR(CreateBindCtx(reserved: 0, ppbc: out bindCtx));
bindCtx.GetRunningObjectTable(out rot);
rot.EnumRunning(out enumMonikers);
IMoniker[] moniker = new IMoniker[1];
IntPtr numberFetched = IntPtr.Zero;
while (enumMonikers.Next(1, moniker, numberFetched) == 0)
{
IMoniker runningObjectMoniker = moniker[0];
string name = null;
try
{
if (runningObjectMoniker != null)
{
runningObjectMoniker.GetDisplayName(bindCtx, null, out name);
}
}
catch (UnauthorizedAccessException)
{
// Do nothing, there is something in the ROT that we do not have access to.
}
Regex monikerRegex = new Regex(@"!VisualStudio.DTE\.\d+\.\d+\:" + processId, RegexOptions.IgnoreCase);
if (!string.IsNullOrEmpty(name) && monikerRegex.IsMatch(name))
{
Marshal.ThrowExceptionForHR(rot.GetObject(runningObjectMoniker, out runningObject));
}
}
}
finally
{
if (enumMonikers != null)
Marshal.ReleaseComObject(enumMonikers);
if (rot != null)
Marshal.ReleaseComObject(rot);
if (bindCtx != null)
Marshal.ReleaseComObject(bindCtx);
}
return runningObject as EnvDTE.DTE;
}
What confuses me is that I can get the local IVsDebugger via the call
var MYDEBUGGER = Package.GetGlobalService(typeof(SVsShellDebugger)) as IVsDebugger;
Which I see is using a GlobalService. I don't think there is an equivalent in the DTE I retrieve.
Any insight?
I ran into this issue as well (however in my case, I'm actually trying to retrieve the IVsDebugger in proc rather than what sounds like out of proc); after debugging into how vsdebug!CDebugger::QueryInterface works I determined the actual issue appears to be that the calling thread in your application needs to be STA.
When the calling thread in your application is MTA:
vsdebug!CDebugger::QueryInterface returns with HRESULT 0 (S_OK),
but this shortly gets turned into 0x80040155 (REGDB_E_IIDNOTREG) by OLE, due to CStdWrapper::GetPSFactory failing to find a proxy DLL for this type,
this error in turn gets converted by CRemoteUnknown::RemQueryInterface to 0x80004002 (E_NOINTERFACE),
which is what is reported back to you if you try Marshal.QueryInterface in C# to see what's going on directly.
If your program contains in-proc components that live inside the remote Visual Studio process (as mine does), you can retrieve and execute your operations against the IVsDebugger on the UI thread. Otherwise, you can potentially create a new Thread and call thread.SetApartmentState(ApartmentState.STA) on it prior to starting it.
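For example, a minimal sketch of the second option, reusing the ServiceProvider from the question's CaptureDebugger method (untested, just to illustrate the STA requirement):

IVsDebugger vsDebugger = null;
var staThread = new System.Threading.Thread(() =>
{
    // Same call as in the question, but now running on an STA thread so the
    // proxy for IVsDebugger can be created.
    vsDebugger = sp.GetService(typeof(SVsShellDebugger)) as IVsDebugger;
});
staThread.SetApartmentState(System.Threading.ApartmentState.STA);
staThread.Start();
staThread.Join();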
I have been using the documentation here https://support.microsoft.com/en-gb/help/304655/how-to-programmatically-compile-code-using-c-compiler
I am trying to learn a bit more about compilers. I want to host on my own site a simple text editor that I can use to run the code of a script, say something simple like:
The program is required to print out "Hello World" via Console.WriteLine("Hello World");
If anything other than Hello World is printed out, the program would be in error.
I have been looking at the Microsoft code for running .NET code at runtime, but both of these approaches force it to create an exe; I want the result to appear in a text box, like .NET Fiddle.
I presume what I have to do somehow is run the exe and use the process to return the result; bear in mind this is inside an MVC application.
Or are there any cool NuGet packages that can save me time here?
private void Compiler(string code)
{
CSharpCodeProvider codeProvider = new CSharpCodeProvider();
ICodeCompiler icc = codeProvider.CreateCompiler();
string Output = "Out.exe";
System.CodeDom.Compiler.CompilerParameters parameters = new
CompilerParameters();
//Make sure we generate an EXE, not a DLL
parameters.GenerateExecutable = true;
parameters.OutputAssembly = Output;
CompilerResults results = icc.CompileAssemblyFromSource(parameters, code);
if (results.Errors.Count > 0)
{
foreach (CompilerError CompErr in results.Errors)
{
CompilerError error = new CompilerError();
error.Line = CompErr.Line;
error.ErrorNumber = CompErr.ErrorNumber;
error.ErrorText = CompErr.ErrorText;
}
}
else
{
//Successful Compile
CodeResult result = new CodeResult();
result.Message = "Success";
}
}
So how would one capture the above and return it, and also how does one add support for other languages like Python or VB.NET?
Is this something that Blazor could perhaps be good at doing for me?
I am wanting to provide an experience like .NET Fiddle (https://dotnetfiddle.net).
Suchiman / Robin Sue has integrated the Monaco editor as well as an in-browser C# compiler in this nifty Blazor project (live demo).
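If you want to stay closer to the CodeDOM approach from the question, a rough sketch (my own assumption of how it could be wired up, not taken from Suchiman's project) is to compile in memory and temporarily redirect Console.Out so the submitted program's output comes back as a string:

using System;
using System.CodeDom.Compiler;
using System.IO;
using System.Linq;
using System.Reflection;
using Microsoft.CSharp;

public static class InMemoryRunner
{
    public static string CompileAndRun(string code)
    {
        using (var provider = new CSharpCodeProvider())
        {
            var parameters = new CompilerParameters
            {
                GenerateExecutable = true,
                GenerateInMemory = true // no Out.exe left on disk
            };
            parameters.ReferencedAssemblies.Add("System.dll");

            CompilerResults results = provider.CompileAssemblyFromSource(parameters, code);
            if (results.Errors.HasErrors)
            {
                return string.Join(Environment.NewLine,
                    results.Errors.Cast<CompilerError>()
                                  .Select(e => e.ErrorNumber + ": " + e.ErrorText));
            }

            // Capture everything the submitted code writes via Console.WriteLine.
            MethodInfo entry = results.CompiledAssembly.EntryPoint;
            object[] args = entry.GetParameters().Length == 0
                ? null
                : new object[] { new string[0] };

            var captured = new StringWriter();
            TextWriter original = Console.Out;
            Console.SetOut(captured);
            try
            {
                entry.Invoke(null, args);
            }
            finally
            {
                Console.SetOut(original);
            }
            return captured.ToString();
        }
    }
}

Note that running user-submitted code in-process like this is fine for learning, but it is not safe to expose on a public site without sandboxing, which is part of what services like .NET Fiddle handle for you.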
When running this program I keep receiving the error:
An unhandled exception of type 'System.Security.SecurityException' occurred
Additional Information: ECall methods must be packaged into a system module.
class Program{
public static void Main()
{
Brekel_ProBody2_TCP_Streamer s = new Brekel_ProBody2_TCP_Streamer();
s.Start();
s.Update();
s.OnDisable();
}
}
How can I fix this?
The important part of the Brekel library is as follows:
//======================================
// Connect to Brekel TCP network socket
//======================================
private bool Connect()
{
// try to connect to the Brekel Kinect Pro Body TCP network streaming port
try
{
// instantiate new TcpClient
client = new TcpClient(host, port);
// Start an asynchronous read invoking DoRead to avoid lagging the user interface.
client.GetStream().BeginRead(readBuffer, 0, READ_BUFFER_SIZE, new AsyncCallback(FetchFrame), null);
Debug.Log("Connected to Brekel Kinect Pro Body v2");
return true;
}
catch (Exception ex)
{
Debug.Log("Error, can't connect to Brekel Kinect Pro Body v2!" + ex.ToString());
return false;
}
}
//===========================================
// Disconnect from Brekel TCP network socket
//===========================================
private void Disconnect()
{
if (client != null)
client.Close();
Debug.Log("Disconnected from Brekel Kinect Pro Body v2");
}
public void Update()
{
// only update if connected and currently not updating the data
if (isConnected && !readingFromNetwork)
{
// find body closest to the sensor
closest_skeleton_ID = -1;
closest_skeleton_distance = 9999999f;
for (int bodyID = 0; bodyID < skeletons.GetLength(0); bodyID++)
{
if (!skeletons[bodyID].isTracked)
continue;
if (skeletons[bodyID].joints[(int)brekelJoint.waist].position_local.z < closest_skeleton_distance)
{
closest_skeleton_ID = bodyID;
closest_skeleton_distance = skeletons[bodyID].joints[(int)brekelJoint.waist].position_local.z;
}
}
// apply to transforms (cannot be done in FetchFrame, only in Update thread)
for (int bodyID = 0; bodyID < skeletons.GetLength(0); bodyID++)
{
for (int jointID = 0; jointID < skeletons[bodyID].joints.GetLength(0); jointID++)
{
// only apply if transform is defined
if (skeletons[bodyID].joints[jointID].transform != null)
{
// apply position only for waist joint
if (jointID == (int)brekelJoint.waist)
skeletons[bodyID].joints[jointID].transform.localPosition = skeletons[bodyID].joints[jointID].position_local;
// always apply rotation
skeletons[bodyID].joints[jointID].transform.localRotation = skeletons[bodyID].joints[jointID].rotation_local;
}
}
}
It appears you are using a Unity library but trying to run it as a standalone application?
This error means you are calling a method that is implemented within the Unity engine. You can only use the library from within Unity.
If you want to use it standalone, you'll need to compile the library without referencing any Unity libraries, which probably means you'll need to provide implementations for anything the library is using (such as MonoBehaviour).
http://forum.unity3d.com/threads/c-error-ecall-methods-must-be-packaged-into-a-system-module.199361/
http://forum.unity3d.com/threads/security-exception-ecall-methods-must-be-packaged-into-a-system-module.98230/
Additionally,
If your only problem is Debug.Log() throwing an exception, you could use reflection to plant your own Logger instance instead of Unity's one.
Step 1: Create a "MyLogHandler" class that will do your actual logging (write to a file, write to the console, or do nothing). Your class needs to implement the "ILogHandler" interface (a minimal sketch follows after these steps).
Step 2: Replace unity default one with new one.
var newLogger = new Logger(new MyLogHandler());
var fieldInfo = typeof(Debug).GetField("s_Logger", BindingFlags.GetField | BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Static);
fieldInfo.SetValue(null, newLogger);
Note: keep in mind that reflection accesses the field by name, and if Unity decides to change it in the future you will not get a compile error; an exception will be thrown at run time instead.
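For step 1, a minimal sketch of such a handler (this one just writes to the console), assuming Unity's ILogHandler interface with its LogFormat/LogException members:

using System;
using UnityEngine;

public class MyLogHandler : ILogHandler
{
    public void LogFormat(LogType logType, UnityEngine.Object context, string format, params object[] args)
    {
        // Forward Unity-style formatted log messages to the console.
        Console.WriteLine("[{0}] {1}", logType, string.Format(format, args));
    }

    public void LogException(Exception exception, UnityEngine.Object context)
    {
        Console.WriteLine(exception);
    }
}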
I know this is old, but I came across a way to unit test Unity assemblies from within Visual Studio just by toggling a symbol definition in the build settings. As long as you're OK with only being able to either run tests or have the testable components usable in Unity at any one time, you can toggle between unit-testing mode and Unity mode like this:
Make your Unity component a partial class. Have one file where you declare that the partial class extends MonoBehaviour, and put anything that actually has to use Unity assemblies in there. This part will not be covered by the unit tests, but everything else will.
Use conditional compilation so that this file's contents are compiled out when a specific symbol is defined during the build; I used UNIT_TEST_NO_UNITY_INTEGRATION in my case (see the sketch after these steps).
When you want to run the unit tests from Visual Studio, update the build settings to define that symbol. This will exclude the Unity-specific stuff from step 1 from the build and allow Visual Studio to run your unit tests.
When you are finished testing, edit the build settings again and remove that symbol definition. Now your unit tests won't be able to run but your assemblies will work within Unity again.
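A minimal sketch of the split described in these steps (type and member names are illustrative); the Unity half disappears from the build whenever UNIT_TEST_NO_UNITY_INTEGRATION is defined:

// File 1: PlayerHealth.cs - plain .NET logic, covered by the unit tests.
public partial class PlayerHealth
{
    public int Current { get; private set; } = 100;

    public void ApplyDamage(int amount)
    {
        Current = System.Math.Max(0, Current - amount);
    }
}

// File 2: PlayerHealth.Unity.cs - the Unity-dependent half, compiled out
// when UNIT_TEST_NO_UNITY_INTEGRATION is defined in the build settings.
#if !UNIT_TEST_NO_UNITY_INTEGRATION
public partial class PlayerHealth : UnityEngine.MonoBehaviour
{
    private void Update()
    {
        if (Current <= 0)
            UnityEngine.Debug.Log("Player died");
    }
}
#endif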