AJAX for Responsiveness: The First Big Leap

Posted on February 28, 2008 By Luis Fernandez

People do not want page loads. They want answers. AJAX is the shortcut between click and result.

Remember the first time Gmail did not blink?

There was that moment when we realized a page did not need to blink after every click. The inbox updated, the sidebar count moved, a message toggled read to unread, and the browser never did the big white flash. That was the instant AJAX went from buzzword to muscle memory. You clicked, the app listened, sent a quiet XMLHttpRequest, and brought back just what it needed. No full reload. No breaking the flow.

Right now our daily toolkit has jQuery, Prototype with script.aculo.us, Dojo, YUI, and Ext. Each promises a smoother ride across IE6 and IE7, Firefox 2, and Safari 3. We still babysit quirks like ActiveX on old IE or that one box model surprise. Yet the prize is clear. Responsiveness is the new bar. If your app makes people wait for a whole page cycle when all they did was flip a flag or peek at suggestions, they will bounce.

So this is a field note from the trenches. What worked. What bit us. And why the lessons of AJAX will age well.

A day of shaving seconds

We were building a customer profile form. Classic stuff. Name, address, a city field that fills the state as you type. The old way did a full submit to check the postal code. That meant validation plus a whole page refresh just to render a tiny red message near one input. You could almost hear the sigh.

We swapped that for an AJAX call tied to blur on the postal code field. The app now checks the server in the background, brings back JSON, and paints the hint in place. When the network is fast the user barely notices. When the network is slow we show a tiny spinner only if the request crosses a short threshold, so fast responses do not flicker a spinner. The form feels alive, but it is also progressive. If scripts are off, the submit still validates on the server and renders a full page message. No dead ends.
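
Here is a minimal sketch of that wiring, assuming hypothetical ids (postalCode, postalHint), a made-up /validate/postal endpoint that returns { valid, message }, and the getJSON helper shown in the first deep dive below:

// validate the postal code in the background when the field loses focus
var postalField = document.getElementById('postalCode');
var postalHint = document.getElementById('postalHint');

postalField.onblur = function () {
  var code = postalField.value;
  if (!code) { return; }
  getJSON('/validate/postal', { code: code }, function (data) {
    // paint the hint in place as a text node, no page cycle and no markup injection
    postalHint.className = data.valid ? 'hint' : 'hint error';
    postalHint.innerHTML = '';
    postalHint.appendChild(document.createTextNode(data.message));
  }, function () {
    // network trouble: stay quiet, the plain server side validation still runs on submit
    postalHint.innerHTML = '';
  });
};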

That one change saved a few seconds per edit and removed a full reload. Multiply that across the app and you start to feel a real shift. Less waiting, fewer interrupts, and more focus on the actual task.

Deep dive one: the XMLHttpRequest core

At the center is XMLHttpRequest. It is old, weird in places, and still the easiest way to fetch data without a full navigation. Here is a small helper that works in modern browsers and also in IE6 with ActiveX.

// tiny XHR helper with IE fallback, GET with cache bust, JSON parse
function getJSON(url, params, onSuccess, onError) {
  var q = [];
  for (var key in params) {
    if (params.hasOwnProperty(key)) {
      q.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
    }
  }
  // cache bust to bypass proxy and IE aggressive caching
  q.push('_ts=' + new Date().getTime());
  var fullUrl = url + (url.indexOf('?') > -1 ? '&' : '?') + q.join('&');

  var xhr = window.XMLHttpRequest ? new XMLHttpRequest() : new ActiveXObject('Microsoft.XMLHTTP');
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
      var ok = (xhr.status >= 200 && xhr.status < 300) || xhr.status === 304;
      if (ok) {
        try {
          // use json2 if available, fall back to eval guarded by parens
          var data = window.JSON && JSON.parse ? JSON.parse(xhr.responseText) : eval('(' + xhr.responseText + ')');
          onSuccess && onSuccess(data, xhr);
        } catch (e) {
          onError && onError(e, xhr);
        }
      } else {
        onError && onError(new Error('HTTP ' + xhr.status), xhr);
      }
    }
  };
  xhr.open('GET', fullUrl, true);
  xhr.send(null);
}

Notes from real use:

• IE tends to cache GET too hard. The timestamp param is your friend. For sensitive data you can also switch to POST; a POST variant of the helper is sketched after these notes.

• JSON is lighter than XML and feels natural in JavaScript. Use json2.js from Crockford to avoid eval in older browsers. If you must use eval, wrap the JSON in parens.

• Keep payloads small. Remove extra whitespace on the server, send only the fields you need. Every byte matters on slow links and on mobile data cards.

• Respect the same origin rule. XHR will only talk to the same scheme, domain, and port. We will talk about safe ways around that in a bit.
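
For the POST route mentioned in the first note, a companion helper can mirror getJSON. This is a sketch under the same assumptions (json2.js or eval fallback, ActiveX fallback for old IE), with postJSON as a made-up name:

// POST variant: form encoded body, no cache bust needed since browsers do not cache POST
function postJSON(url, params, onSuccess, onError) {
  var q = [];
  for (var key in params) {
    if (params.hasOwnProperty(key)) {
      q.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
    }
  }
  var xhr = window.XMLHttpRequest ? new XMLHttpRequest() : new ActiveXObject('Microsoft.XMLHTTP');
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
      if (xhr.status >= 200 && xhr.status < 300) {
        try {
          var data = window.JSON && JSON.parse ? JSON.parse(xhr.responseText) : eval('(' + xhr.responseText + ')');
          onSuccess && onSuccess(data, xhr);
        } catch (e) {
          onError && onError(e, xhr);
        }
      } else {
        onError && onError(new Error('HTTP ' + xhr.status), xhr);
      }
    }
  };
  xhr.open('POST', url, true);
  xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
  xhr.send(q.join('&'));
}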

Deep dive two: update the DOM without breaking the page

Fetching is the easy part. Making the page feel stable while it updates is where craft shows. When you replace a chunk of markup with innerHTML, you lose event handlers on the old nodes. Some libraries re-bind for you. If you go raw, wire events at a higher container so new nodes are covered. This is the delegation trick:

// event delegation for a dynamic list
document.getElementById('todoList').onclick = function (e) {
  e = e || window.event;
  var target = e.target || e.srcElement;
  if (target && target.tagName === 'A' && target.getAttribute('data-action') === 'remove') {
    removeItem(target.getAttribute('data-id'));
    return false;
  }
};

Visual feedback is half the vibe of responsiveness. Show that you heard the click. A quick class change, a disabled button, a tiny spinner after a short delay. The short delay is key. If you show a spinner for a fast response, the eye catches a flicker.

// delayed spinner to avoid flicker for fast responses
var spinnerTimeout;
function withSpinner(promiseLike, el) {
  spinnerTimeout = setTimeout(function () { el.style.visibility = 'visible'; }, 120);
  promiseLike(function done() {
    clearTimeout(spinnerTimeout);
    el.style.visibility = 'hidden';
  });
}

Memory leaks on old IE show up when closures hold on to DOM nodes. A classic pitfall is storing a node in a long-lived array and also assigning a bound function that closes over that same node. Free it when you remove the node. Set node.onclick = null, then drop the references.
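
To make that concrete, here is a small cleanup sketch; the rowCache registry and removeRow name are made up for the example:

// keep references in one place so they can be released together
var rowCache = {};

function removeRow(id) {
  var node = rowCache[id];
  if (!node) { return; }
  // break the cycle first: the handler closes over the node, the node points back at the handler
  node.onclick = null;
  if (node.parentNode) { node.parentNode.removeChild(node); }
  // drop our own reference so old IE can actually reclaim the memory
  delete rowCache[id];
}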

Do not forget people who browse with scripts off, screen readers, or slow devices. Build the form to work with plain HTML submit. Then sprinkle AJAX on top. This is progressive enhancement in practice. Links should resolve to a real page or anchor. Buttons should submit. Then your script intercepts and upgrades the flow.
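
A sketch of that upgrade path, assuming a plain form with id profileForm that already posts and validates on the server, the postJSON helper from earlier, and a hypothetical showInlineResult that paints the response:

// the plain HTML submit keeps working; script only intercepts when it is available
var form = document.getElementById('profileForm');

function collectFields(f) {
  var params = {}, els = f.elements;
  for (var i = 0; i < els.length; i++) {
    if (els[i].name) { params[els[i].name] = els[i].value; }
  }
  return params;
}

if (form) {
  form.onsubmit = function () {
    postJSON(form.action, collectFields(form), function (data) {
      showInlineResult(data);   // paint success or field errors in place
    }, function () {
      form.submit();            // network trouble: fall back to the normal full submit
    });
    return false;               // cancel the default page cycle
  };
}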

Search engines read HTML that comes from the first response. Content loaded later with XHR is not seen. If a page depends on AJAX to render core content, also offer a crawlable version at the same URL, or render the first state on the server and then layer in updates. Your visitors and your rankings both win.

Deep dive three: live data, polling, and safe cross domain tricks

People love fresh data. The web was built on request then response, so we fake push with a few patterns.

• Simple polling. Ask every N seconds, adjust the pace when idle or active.

// simple polling with backoff when there is nothing new or the network misbehaves
var pollDelay = 3000, maxDelay = 30000, stopped = false;
var lastId = 0; // id of the newest item we have already rendered

function poll() {
  if (stopped) return;
  getJSON('/feed/updates', { since: lastId }, function (data) {
    renderUpdates(data);
    // if no new data, back off a little
    pollDelay = data && data.length ? 3000 : Math.min(pollDelay + 2000, maxDelay);
    setTimeout(poll, pollDelay);
  }, function () {
    // network trouble, try slower, show a subtle notice
    pollDelay = Math.min(pollDelay * 2, maxDelay);
    setTimeout(poll, pollDelay);
  });
}
poll();

• Long polling. Ask and keep the request open on the server until there is data or a timeout. Then reconnect. This reduces empty responses and feels closer to push. You will want proper server timeouts and a cap on concurrent requests per user.
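
A client side sketch of that loop, reusing getJSON and assuming a made-up /feed/wait endpoint that the server holds open until news arrives or its own timeout fires:

// long poll: reconnect as soon as an answer comes back
var lastSeen = 0;

function longPoll() {
  getJSON('/feed/wait', { since: lastSeen }, function (data) {
    if (data && data.items && data.items.length) {
      renderUpdates(data.items);   // same app side render used by the simple polling example
      lastSeen = data.lastId;
    }
    longPoll();                    // ask again right away
  }, function () {
    setTimeout(longPoll, 5000);    // trouble: wait a bit before reconnecting
  });
}
longPoll();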

• Streaming with a hidden iframe. Old school but works in browsers that do not like holding XHR open. The server flushes chunks and the page appends them as they arrive. It is quirky to style and to error handle, yet still handy in some corners.

Now the cross domain story. XHR will not talk to another domain. Two common routes in use today:

• Server side proxy. Your app calls your own server, your server calls the third party, you pass the data back. This keeps secrets on your end and you can cache to cut load. Add rate limits so a single page cannot hammer the target service.

• JSONP. Create a script tag, point it at a URL that returns a JavaScript function call with JSON inside. Only works with GET and public data. Watch for trust and keep callbacks narrow to avoid surprises.

// basic JSONP call
function jsonp(url, params, cbName, onDone) {
  params = params || {};
  params.callback = cbName;
  var q = [];
  for (var k in params) q.push(encodeURIComponent(k) + '=' + encodeURIComponent(params[k]));
  var s = document.createElement('script');
  s.src = url + '?' + q.join('&');
  s.async = true;
  var head = document.getElementsByTagName('head')[0];
  // register the callback before the script is added, so it is ready whenever the response runs
  window[cbName] = function (data) {
    onDone && onDone(data);
    // cleanup
    head.removeChild(s);
    try { delete window[cbName]; } catch (e) { window[cbName] = void 0; }
  };
  head.appendChild(s);
}

// usage
jsonp('https://api.example.com/search', { q: 'ajax' }, 'onSearch', function (data) {
  renderResults(data);
});

Security is part of the job. XSS rides in when you inject untrusted HTML. Escape before inserting. Prefer text nodes when you can. For write actions, protect against CSRF. A simple token in a hidden input that the server verifies is a solid start. You can also add same site cookies when the browser supports them, but the token pattern works across our current browser mix.
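
Two small habits in code, assuming a csrf_token hidden input printed by the server, the postJSON helper from earlier, and made-up endpoint and element names:

// escape untrusted text if it must travel through innerHTML
function escapeHtml(s) {
  return String(s)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

// read the token the server rendered into the form, send it with every write
function csrfToken() {
  var field = document.getElementsByName('csrf_token')[0];
  return field ? field.value : '';
}

function saveNote(text) {
  postJSON('/notes/save', { body: text, csrf_token: csrfToken() }, function (data) {
    var item = document.createElement('li');
    item.appendChild(document.createTextNode(data.body));   // text node: nothing to escape, nothing to inject
    document.getElementById('noteList').appendChild(item);
  });
}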

Finally, measure. Firebug gives you a Net panel to see timings and headers. YSlow is great to spot heavy scripts and missing cache headers. Watch the waterfall and you will find easy wins: sprite your icons, gzip text, set far future cache headers on static files, split vendor and app code so you cache vendor longer.

Timeless lessons from the first big leap

AJAX feels like a trick and a philosophy rolled into one. The trick is small. Send a request in the background and touch the DOM when it returns. The philosophy is bigger. Make the page feel like a living thing. Respect what the person is doing more than what your server wants to render next.

Here is the short list I keep on a sticky note:

• Do less per click. Move only the pixels that must move. Ship only the bytes you need.

• Progressive first. Plain HTML works. Scripts upgrade it. People win either way.

• Asynchronous mindset. Stop thinking in page cycles. Think in small events and feedback. A form blur can be a moment to help. A keystroke can fetch suggestions without stealing focus; a small debounce sketch follows this list.

• Measure the wait. Use Firebug and YSlow. Trim payloads. Cache smart. Cut the number of requests. The network is the slowest part of most apps.

• Be kind to older browsers. Test on IE6 and IE7. Watch memory and event leaks. Use a library when it saves your week.

• Keep content crawlable. If you fetch key content with XHR, also render a plain view. Search, bookmarks, and email previews still rely on the first HTML that comes down the wire.

• Security travels with you. Escape untrusted content. Use CSRF tokens on writes. Treat JSONP like a public window and keep it simple.
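
One item on that list is easy to get wrong: suggestions on keystroke should wait for a pause in typing instead of firing on every key. A debounce sketch, assuming getJSON, made-up searchBox and suggestBox ids, and a /suggest endpoint that returns an array of strings:

// only the pause after typing triggers a request
var suggestTimer;
var searchBox = document.getElementById('searchBox');
var suggestBox = document.getElementById('suggestBox');

searchBox.onkeyup = function () {
  clearTimeout(suggestTimer);
  suggestTimer = setTimeout(function () {
    var term = searchBox.value;
    if (term.length < 2) { return; }   // too short to be worth a round trip
    getJSON('/suggest', { q: term }, function (items) {
      suggestBox.innerHTML = '';
      for (var i = 0; i < items.length; i++) {
        var li = document.createElement('li');
        li.appendChild(document.createTextNode(items[i]));
        suggestBox.appendChild(li);
      }
    });
  }, 250);
};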

We are early in this move to responsive web apps. The tools are good and getting better. The browser makers are shipping fixes and speed ups at a steady clip. There is still plenty of weird, from ActiveX to subtle caching. Still, the goal is simple. Reduce the gap between intent and result. That is what people feel. That is what keeps them around.

Maybe in a few years we will have new specs for cross site requests, better data pipes, and smoother history control for single page flows. For now, AJAX gave us the first big leap. It taught us to treat the web as a live surface, not a stack of full reloads. Keep the page steady, move just what you need, and let people get on with their work.

Try this today: pick one page that does a tiny action with a full submit. Swap it for an AJAX call with a delayed spinner and a server side fallback. Watch how the whole page suddenly feels lighter.

If you want a quick starter: load jQuery 1.2, use $.getJSON for reads and $.post for writes, return compact JSON, and remember the cache bust for GET on IE. Keep the server endpoints simple, return only the fields you paint, and handle errors with a small message inline.
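
A starter in that spirit, as a sketch rather than a recipe: it assumes jQuery 1.2 on the page plus made-up endpoints (/profile/validate, /profile/save) and ids, and it leans on $.ajaxSetup, $.getJSON, $.post, and serialize; double check the option names against the 1.2 docs you have.

// tell jQuery to add its own cache bust to GET requests, which covers the IE caching issue
$.ajaxSetup({ cache: false });

// read path: validate the postal code on blur and paint a small inline hint
$('#postalCode').blur(function () {
  var code = $(this).val();
  if (!code) { return; }
  $.getJSON('/profile/validate', { code: code }, function (data) {
    $('#postalHint').text(data.message);
  });
});

// write path: save over POST, expect compact JSON back, paint the status inline
$('#profileForm').submit(function () {
  $.post('/profile/save', $(this).serialize(), function (data) {
    $('#saveStatus').text(data.message);
  }, 'json');
  return false;
});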
