I've noticed a strange behaviour in Microsoft Edge with a jQuery ajax request.
In a page I have this code:
$("#btnSend").click(function () {
$("#sending").show();
$.ajax({
type: 'GET',
url: '/Report/SaveQuoteOnFile?quote=18',
crossDomain: true,
success: function (msg) {
if (msg == 'True') {
alert('Email sent to the client');
}
$("#sending").hide();
},
error: function (request, status, error) {
$("#sending").hide();
}
});
});
The WebApi is built in C# and the code is something like:
public bool SaveQuoteOnFile(int quote)
{
    bool rtn = false;
    Response.Headers.Add("Access-Control-Allow-Methods", "GET, POST");
    Response.Headers.Add("Access-Control-Allow-Headers", "accept, authority");
    Response.Headers.Add("Access-Control-Allow-Credentials", "true");
    // code to send an email
    // if the email is sent rtn = true
    ...
    return rtn;
}
When I open the page with Chrome and click the btnSend button, Chrome calls the WebAPI, the email is sent, the result is true and I show a message. If I insert a breakpoint in the WebApi, I can follow the code step by step. The behaviour is always the same.
Now I open the same page with Microsoft Edge and click the btnSend button: it seems to call the WebApi, the result is true and I show a message, but the email isn't sent. The same breakpoint doesn't fire.
The first thing I thought was that the WebApi code was wrong. Obviously. But I tried it with Mozilla and Safari and it works fine.
Then I had an idea: I cleared the cache and other stored data in Microsoft Edge, clicked the button, and now everything works fine! Happy with that, I showed my work to my boss on my desktop. In Microsoft Edge I clicked the button again and nothing happened, while it kept working fine in the other browsers. I cleared the cache and other stored data in Microsoft Edge again and... it works fine again...
Ok, so the problem is the cache (???), but how can I solve this problem in general?
Thank you in advance for any suggestions.
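For reference, a common workaround for this kind of aggressive GET caching is to disable caching on the jQuery side, either per request or globally; jQuery's cache: false option simply appends a timestamp query parameter so the browser cannot serve a stale response. A minimal sketch based on the call from the question (same endpoint and handlers assumed):

$("#btnSend").click(function () {
    $("#sending").show();
    $.ajax({
        type: 'GET',
        url: '/Report/SaveQuoteOnFile?quote=18',
        cache: false, // appends a "_=<timestamp>" parameter so each request bypasses the cache
        success: function (msg) {
            if (msg === 'True') {
                alert('Email sent to the client');
            }
            $("#sending").hide();
        },
        error: function () {
            $("#sending").hide();
        }
    });
});

// Alternatively, disable caching for every ajax request on the page:
$.ajaxSetup({ cache: false });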
I have created a to-do list app using Node, Express and Mongoose:
To delete a task, the user hits the cross button on the right hand side. This sends a POST request with the task ID to the /delete_task endpoint. The router for this endpoint is /routes/delete_task.js:
var express = require('express');
const Task = require('../models/task');

var router = express.Router();
router.use(express.json()); // mount the JSON body parser so req.body is populated

router.post('/', async (req, res, next) => {
    const deleted_task = await Task.findByIdAndDelete(req.body.taskID);
    console.log('Deleted task: \n', deleted_task);
    res.redirect('..');
});

module.exports = router;
The router performs a findByIdAndDelete, and then redirects to the home directory. The router for the home directory renders a view of all the existing tasks in the collection, and looks like:
var express = require('express');
const Task = require('../models/task');

var router = express.Router();

/* GET home page. */
router.get('/', function (req, res, next) {
    Task.find({}, function (err, result) {
        if (err) { console.error(err); }
        if (result) {
            return res.render('index', { title: 'To-do list', tasks: result });
        }
    });
});

module.exports = router;
My problem is that when deleting a task, the findByIdAndDelete successfully deletes the task, but this is not reflected in the redirected home page. The deleted task only disappears once I refresh the page. This suggests that it's some kind of async issue, and that the redirect is happening before the findByIdAndDelete query has finished executing.
To address this, I have made the router.post() callback an async function and am using await on the findByIdAndDelete, and I have also tried placing the res.redirect('..') in a callback function of the findByIdAndDelete, which also does not fix the problem:
router.post('/', (req, res, next) => {
    Task.findByIdAndDelete(req.body.taskID, (err, result) => {
        if (err) {
            console.error(err);
        }
        if (result) {
            console.log(result);
        }
        res.redirect('..');
    });
});
I have looked for other questions on stackoverflow, all of which seem to suggest that this is an async issue caused by the redirect happening before the query has finished executing. The suggested solutions I have found were to make the router.post(...) callback an async function and await the result of the Mongoose query, or to place the res.redirect('..') in the callback of the findByIdAndDelete so that the redirect happens after the query has finished executing. I have tried both of these but the problem remained.
The only other thing I can think of is that I am trying to redirect from within a POST request, and I don't know if this is legit. It seems to work fine looking at the log (see last 2 lines where the GET request to / follows the POST request to /delete_task):
New task submitted: cake
New task created successfully: cake
POST /new_task 302 29.747 ms - 46
GET / 200 4.641 ms - 1701
GET /stylesheets/style.css 304 0.849 ms - -
GET /javascripts/delete_task.js 304 0.479 ms - -
Deleted task:
{
_id: new ObjectId("636a993ca0b8e1f2cc79232a"),
content: 'cake',
completed: false,
__v: 0
}
POST /delete_task 302 10.358 ms - 24
GET / 200 3.867 ms - 1348
This is where I've hit a brick wall and I can't see what might be causing the issue. Really appreciate any help or suggestions anyone might have - cheers.
I don't think this is an asynchronicity problem, because you wait properly before responding to the POST request.
But the res.redirect makes sense only if hitting the cross button navigates from the To-do list page to the /delete_task page and from there back, by virtue of the redirection. This would be possible only with an HTML <form> element that is submitted upon hitting the button.
Is that how you have implemented it? You say that you "send a POST request", but is this through a <form>, or rather through an axios.post or a similar Javascript method? In the latter case, the following would happen:
The Javascript client sends the POST request and the deletion is carried out on the database.
The Javascript client receives a redirection response and sends the GET request.
The Javascript client receives the HTML page for the to-do list as response, but does nothing with it.
In other words: the To-do list page would not be reloaded by this axios.post request. If you want this to happen, don't respond to the POST request with a redirection, but simply with 200 OK, and have the Javascript client execute location.reload() when it receives this response.
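To make that concrete, here is a minimal sketch of what that client-side handler could look like (e.g. in /javascripts/delete_task.js, which appears in the log). The /delete_task URL and the taskID field come from the question; the .delete-task button class, the data-task-id attribute and the fetch-based wiring are assumptions for illustration only:

// Hypothetical client-side wiring for the cross buttons
document.querySelectorAll('.delete-task').forEach(function (button) {
    button.addEventListener('click', function () {
        fetch('/delete_task', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({ taskID: button.dataset.taskId })
        }).then(function (response) {
            if (response.ok) {
                // Reload so the freshly rendered task list reflects the deletion
                location.reload();
            }
        });
    });
});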
I'm using Rotativa to generate a PDF file from a view, which works well, but in the browser I just get the raw file dumped to the console: no download dialog box, no warning, nothing. Here's my code:
Controller
public ActionResult DescargarPDF(int itemId) {
    var presupuesto = ReglasNegocio.Fachada.Consultas.ObtenerPresupuesto(itemId);
    return new Rotativa.PartialViewAsPdf("_PresupuestoFinal", presupuesto) {
        FileName = "Presupuesto_" + itemId + ".pdf",
        PageSize = Rotativa.Options.Size.A4
    };
}
jQuery script:
$(".convertirPDF").on("click", function (id) {
var itemId = $(this).data('itemid');
Pdf(itemId);
});
function Pdf(itemid) {
var id = itemid;
$.ajax({
method: "POST",
url: 'DescargarPDF',
data: { itemId: id },
cache: false,
async: true,
});
};
The button in the HTML:
<button class="convertirPDF btn btn-secondary btn-info" data-itemid="@item.Id">PDF</button>
I've tried several variations of the controller code (with the same result), since the script and view seem to work fine. However, I suspect the HTML or the script needs some tuning to tell the browser it has to download the file?
Thanks everyone in advance.
I found a solution. It's not elegant, but it works.
It turns out I didn't necessarily need ajax to make the request, nor a click handler on the button. I'm fairly sure the issue has something to do with JS and/or jQuery. Nevertheless, there's a simpler way to do this.
I changed my HTML button to a plain link:
<a class="btn btn-secondary btn-info" href="DescargarPDF?itemId=@item.Id">PDF</a>
so it looks like a button but it's really a link to my controller's method. I also removed the script for that button, and now it downloads the file. Not with the intended name, but still.
Thanks to everyone. Happy coding.
UPDATE
I've been working on the same project, and I think I found out why my PDF file was being dumped into the console.
The thing is, jQuery makes the request, so jQuery manages the response. It's that simple. If you check the official docs for .post(), you'll see the following:
The success callback function is passed the returned data, which will be an XML root element or a text string depending on the MIME type of the response. It is also passed the text status of the response.
As of jQuery 1.5, the success callback function is also passed a "jqXHR" object (in jQuery 1.4, it was passed the XMLHttpRequest object).
Most implementations will specify a success handler.
And I wasn't specifying one, so by default it just dropped the response into the console. I hope this sheds some light on the issue and helps. Happy coding.
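If you do want to keep the ajax approach instead of a plain link, one option is to request the PDF as binary data and hand it to the browser as a download yourself. This is only a sketch, not the code from the question; it assumes the same DescargarPDF action and itemId parameter, and uses a raw XMLHttpRequest with responseType 'blob' because $.ajax does not handle binary responses well:

function Pdf(itemId) {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', 'DescargarPDF', true);
    xhr.responseType = 'blob'; // receive the raw PDF bytes instead of text
    xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
    xhr.onload = function () {
        if (xhr.status === 200) {
            // Wrap the bytes in a temporary object URL and trigger a download
            var url = URL.createObjectURL(xhr.response);
            var link = document.createElement('a');
            link.href = url;
            link.download = 'Presupuesto_' + itemId + '.pdf'; // file name assumed to match the controller
            document.body.appendChild(link);
            link.click();
            document.body.removeChild(link);
            URL.revokeObjectURL(url);
        }
    };
    xhr.send('itemId=' + encodeURIComponent(itemId));
}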
Been doing lots of testing and eventually got everything working in Visual Studio using IIS Express. I've deployed it into IIS via a web package (.zip) and now my function has stopped working. Any thoughts? Event Viewer isn't reporting anything.
I click the button to call the ajax; the GET looks like it is properly formulated, but it can't find something, I'm guessing the 'GetData' method. Do I need to change the Url.Action in the project before deploying? (Tried with and without content-type, hence the commented-out line.)
Ajax call
<script>
    $("#button").click(function (event) {
        event.preventDefault();
        var uquery = $('#HospitalID').val();
        var url = '@Url.Action("GetData", "oncologyPatients")';
        //alert(uquery); //ENABLE FOR ERROR CHECKING OF VARIABLE
        var data = { value: uquery };
        $.ajax({
            url: url,
            data: data,
            type: 'GET',
            //contentType: "application/json; charset=utf-8",
            success: function (data) {
                $('#HospitalID').val(data.HID);
                $('#NHS_No_').val(data.NHS);
                $('#Forename').val(data.Fname);
                $('#Surname').val(data.Sname);
                $('#DOB').val(data.DOB);
            },
            error: function (xhr, status, error) {
                alert(status);
                alert(error);
                var err = xhr.responseText;
                alert(err);
            }
        });
    });
</script>
Click the button and locally it finds 'GetData' in the controller and works fine. However, it throws a very generic error when running on IIS 8.5/Server 2012 R2, and I have to go into Chrome's dev tools to get something meaningful, which is:
Failed to load resource: the server responded with a status of 500 (Internal Server Error)
http://siteuat.local.com/oncologyPatients/GetData?value=11111111
Any ideas? I'm leaning towards an absolute URL problem, but it looks as though the route is correct.
All other functions are fine - it's basically just this one call that is having problems.
Figured this out a few minutes after posting - grrr. It was working locally because my user account had permissions to make a TrustedConnection to the database (which I didn't need).
My sqlConnection looked like this:
SqlConnection DWLKup = new SqlConnection("user id=user;" +
    "password=password;server=Server\\Instance;" +
    "Trusted_Connection=yes;" +
    "database=db1;");
I removed the
"Trusted_Connection=yes;" +
And it's all working now :)
I'm quite new to Nancy, so hopefully I'm just doing something silly here. I've got a Nancy service which I'm posting data to like so:
$.ajax({
    type: 'POST',
    url: url,
    data: JSON.stringify({
        searchTerm: productSearchTerm,
        pageSize: pageView.PageSize(),
        selectedBrands: pageView.checkedBrands(),
        pageNumber: pageView.CurrentPage(),
        selectedCategories: pageView.checkedCategories(),
        selectedGender: pageView.checkedGender(),
        SelectedColours: pageView.checkedColour(),
        saleItemsOnly: pageView.saleItemsOnly(),
        selectedMinimumPrice: pageView.minPrice(),
        selectedMaximumPrice: pageView.maxPrice()
    }),
    contentType: "application/json; charset=utf-8",
    dataType: 'json'
})
.done(function (data) {
    bindSearchResult(data);
})
.fail(function (a) {
    console.log(a);
});
Then in the service I need to hold on to a bunch of string values for future requests from the user, which I'm doing like this:
private void AddListOfStringToIsSessionNull(string name, IEnumerable<string> data)
{
    if (Session[name] == null)
    {
        Session[name] = data.ToList();
    }
}
This seems to set the session variables, and an "_nc" cookie is present when I inspect the page after it returns.
However, if I then F5 the page, the session items are all null again at the server.
I've ruled out cross-site posting, as it's all on the same domain.
Could this be an AJAX thing? Seems unlikely as this seems a pretty standard thing to do.
Or can you not set it on a POST?
If so is there a way around this?
If someone could help I'd be forever grateful as otherwise I'm going to have to revert back to writing this in WCF which will make me hurl myself from the window :)
Thanks a lot.
Edit
Open a new incognito window in Chrome and hit the home page: no Nancy cookie present (which is correct).
Enter a search term, which calls back over an AJAX POST and grabs JSON, and also puts a list of strings in the Nancy session.
Check the cookies: a Nancy one has appeared like so, and the session value is correct on postback:
npTBnqPp99nLd5fU0%2btJbq%2fY%2bdf2UFWTaq5D28Az7Jw%3dzF8cIHNxTWX399sbmowjheho2S29ocpKs1TXD51BrbyPPNCeLfAcYqWhkRHqWdwKJNED5kuspllIjhI5rf2W6NKtf8xo68BlF5eLLgJxMtAxw2yD2ednEzUazq1XBt2Id77t5LE5tZVwkpRGDT5b9J0nQnr9zfzCOALXb2hQQGBPkMVyNNTO24pW1UC6Uda3B86LLYA02Jgy4G9DiT6KsutR3pSXO8AZFOlcmAEHbSSX9A8FAHaL ... etc.
I then search for a different search term, which calls this bit of code:
Session.DeleteAll();
The Nancy session is re-populated with new data and the response returns to the browser.
However, at this point the cookie has not been updated with the new value; it is still as below:
npTBnqPp99nLd5fU0%2btJbq%2fY%2bdf2UFWTaq5D28Az7Jw%3dzF8cIHNxTWX399sbmowjheho2S29ocpKs1TXD51BrbyPPNCeLfAcYqWhkRHqWdwKJNED5kuspllIjhI5rf2W6NKtf8xo68BlF5eLLgJxMtAxw2yD2ednEzUazq1XBt2Id77t5LE5tZVwkpRGDT5b9J0nQnr9zfzCOALXb2hQQGBPkMVyNNTO24pW1UC6Uda3B86LLYA02Jgy4G9DiT6KsutR3pSXO8AZFOlcmAEHbSSX9A8FAHaL.... etc.
Is there anything else I need to do to solve this?
So my issue was me being a bit daft really: the cookie implementation works well, but there were occasions when I was stuffing too much into the cookie and pushing it over the 4K cookie limit.
This meant I was seeing inconsistent behaviour: sometimes the cookie worked nicely (it was under 4K), whereas for some search terms too much was being written into it, which meant the cookie was either never created or did not overwrite the existing one.
So yes, my fault, but I thought this answer might aid someone as silly as me trying to store the world in a cookie...
Right, I'm off to write a session provider.
Been trouble-shooting this for a few days now and have basically run dry of leads.
Here's the code:
[WebMethod]
public static bool EnableEditMode()
{
    bool successful = false;
    try
    {
        GlobalSettings globalSettings = StateManager.GetStates<GlobalSettings>();
        globalSettings.EditModeEnabled = true;
        StateManager.SaveGlobalSettings(globalSettings);
        successful = true;
    }
    catch (Exception exception)
    {
        _logger.ErrorFormat("Unable to enable edit mode. Reason: {0}", exception.Message);
    }
    return successful;
}
function EnableEditMode() {
    $.ajax({
        type: "POST",
        url: "Dashboard.aspx/EnableEditMode",
        data: "{}",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function (result) {
            if (result) {
                $find(window.leftPaneID).expand(1);
                $('#' + window.startEditButtonID).hide();
                $('#' + window.finishEditButtonID).show();
            }
        }
    });
}
Here's the error message:
Failed to load resource: the server responded with a status of 404
(Not Found)
http://localhost/csweb/Dashboard/Dashboard.aspx/EnableEditMode
Here's what I've tried:
Ensured that I am up-to-date with Windows Updates. Source
I removed 'EnablePageMethods = True' from my ScriptManager and started using a jQuery ajax POST to execute the code. Nothing broke when I did this, the headers changed slightly, but nothing was fixed.
I tried using <%= ResolveUrl("~/Dashboard/Dashboard.aspx") %>, but the path did not change and I did not notice an effect, so I removed the code. Source
I went into my web.config file and removed the following according to Source:
<authorization>
    <deny users="?"/>
</authorization>
I've ensured that the file is not ReadOnly and granted full-control permissions on the file and parent-folders for all relevant users on the system. (Not a live system so no worries.. just playing around).
I diff'ed the request headers between my working development and non-working deployment; I saw no differences in the request headers.
I ran Permission Wizard on the website, indicated I wished to have the website security settings of a publicly-viewed website, and applied to all folders replacing current security settings. No effect.
Added a MIME type mapping for .json (application/json); no effect, but I left it in since it seemed useful.
At this point I am suiting up to trek into the abyss of settings which is IIS. I am not very familiar with IIS 5.1, though, so I am wondering if there are any specific spots where I should start looking?
I found the reason, but I am working on figuring out how to fix it. I have an ASP.NET AJAX application integrated into an MVC solution. The MVC side of things is picking up the PageMethod and not handling it properly, but only under IIS 5.1:
[HttpException]: The controller for path '/csweb/Dashboard/Dashboard.aspx/EnableEditMode' was not found or does not implement IController.
Are you using ASP.NET MVC? You may need [AcceptVerbs("POST")] on EnableEditMode().
Also, could you try just printing out (or debugging and viewing) the results of:
var pageURL = '<%= ResolveUrl("~/Dashboard/Dashboard.aspx") %>';
var pageURL2 = '<%= ResolveUrl("~") %>';