
Question

Preventing False License Usage in WebLink

asked on December 26, 2014

Since around version 7.x, a potentially severe bug has plagued the application: WebLink will consume every license assigned to it when a user and/or bot visits the site with cookies disabled.

Although WebLink displays a warning to the client, the warning does not prevent it from opening a connection. In fact, when cookies are disabled, WebLink creates a new connection on every click.

This video demonstrates the problem when a client has cookies disabled and navigates through a WebLink 8.x website. The same behavior occurs when certain browsers go "Incognito." It is worth noting, though, that it is not known whether this issue is due to WebLink itself, the server configuration, or both.

The method below is one way of ensuring that clients who have cookies disabled do not use up valuable licenses and, more importantly, do not give a false indication of heavy usage of a WebLink website.

  1. Open “Login.aspx” in any code/text editor.
  2. Copy the following code and paste it underneath the “</html>” tag:
    <script language="vbscript" runat="server">
       
        Protected Sub Page_PreInit(sender As Object, e As EventArgs) Handles MyBase.PreInit
            If Request.Cookies("CookieCheck") Is Nothing Then
                If Me.IsCookieDisabled Then
                    Try
                        ' If there is already a connection open, terminate it
                        Dim Conn As WebLinkControls.WLConnection = Session("WLConnection")
                        If Not IsNothing(Conn) Then
                            Conn.Terminate()
                            WebLinkControls.WLSession.End(Session)
                        End If
                    Catch ex As Exception
                        ' Ignore errors while tearing down a stale connection
                    End Try
                    Dim error_message As String = "In order to use WebLink you must have cookies enabled. Please enable cookies in your browser and try your request again. If you are still receiving this message after enabling cookies please contact the administrator."
                    Session("ErrorMsg") = error_message
                    Server.Transfer("~/Error.aspx", False)
                End If
            End If
        End Sub
        
        Private Function IsCookieDisabled() As Boolean
            Dim currentUrl As String = Request.RawUrl
            If Not currentUrl.Contains("cc=1") Then
                Try
                    ' Set a test cookie, then redirect back to the same URL with
                    ' a "cc=1" marker so the next request can verify the cookie
                    Dim c As New HttpCookie("CookieCheck", "1")
                    c.HttpOnly = True
                    c.Expires = DateTime.Now.AddDays(1)
                    Response.Cookies.Add(c)
                    If currentUrl.Contains("?") Then
                        currentUrl = currentUrl & "&cc=1"
                    Else
                        currentUrl = currentUrl & "?cc=1"
                    End If

                    Response.Redirect(currentUrl)
                Catch
                End Try
            End If
            ' The cookie was set before the redirect; if it did not come back,
            ' or the browser reports no cookie support, cookies are disabled
            If Not Request.Browser.Cookies OrElse Request.Cookies("CookieCheck") Is Nothing Then
                Return True
            End If
            Return False
        End Function
    </script>
  3. Save the file (no recompiling necessary).

 

The code above performs a true check of whether the connecting client has cookies enabled; if not, it redirects the client to a warning page and, more importantly, stops WebLink from opening a connection.

 

UPDATE: The code has been modified to accept rewritten URLs in WebLink as well as standard URLs.

 

Hope this helps!

 


Replies

replied on March 16, 2017

Hi, we tried this and are still getting hanging sessions. Any advice?

replied on March 29, 2017

Daniel,

 Which version are we talking about? This was implemented by Laserfiche after version 8.x

 

Wes

replied on March 31, 2017

We're on the most current version. 9 Service Pack 1 (9.0.1.275)

Still having issues with this.

replied on March 31, 2017

You may want to look at blocking bots/spiders altogether to see whether they are the problem. If you do not already have a robots.txt file in the root directory of WebLink, create one and insert the following into it:

 

User-agent: *
Disallow: /

 

Keep in mind this will only block crawlers that honor the robots.txt file.
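Crawlers that ignore robots.txt can be refused at the IIS level instead. A minimal sketch of a rule for WebLink's web.config, assuming the IIS URL Rewrite module is installed (the user-agent patterns below are examples; substitute the bots you actually see in your own logs):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Refuse any request whose User-Agent matches a known crawler -->
      <rule name="BlockCrawlers" stopProcessing="true">
        <match url=".*" />
        <conditions>
          <add input="{HTTP_USER_AGENT}" pattern="AhrefsBot|MJ12bot|SemrushBot" />
        </conditions>
        <action type="CustomResponse" statusCode="403"
                statusReason="Forbidden"
                statusDescription="Crawlers are not permitted on this site" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Because the request is rejected before any ASP.NET page runs, no WebLink session is ever created for the blocked client.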

replied on April 21, 2017

I've done that. We've tried so many of your solutions and it's still happening... Any more ideas? (I really appreciate all your help by the way.)

 

https://answers.laserfiche.com/questions/53691/Weblink-Log 
https://support.laserfiche.com/forums.aspx?Link=viewtopic.php%3ft%3d16125%26amp 
https://answers.laserfiche.com/questions/54336/Public-Accessing-WebLink-and-Licenses-Not-Timing-Out 

replied on April 24, 2017

Daniel,

 Have a look at the following post (old):

https://support.laserfiche.com/Forums.aspx?Link=viewtopic.php%3ft%3d18065%26amp%3bhighlight%3dlicence%2blicense%26amp

 

Another thing you can test: when all the licenses have been used up, go into the Laserfiche Administrator and force-disconnect the licenses, then watch how quickly they come back. That will tell you whether it is user related or bot related.
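To help with that, the IIS logs can be summarized by user-agent. A minimal sketch using awk (the sample log below is illustrative; in your own W3C-format logs, match the field number to the position of cs(User-Agent) in the #Fields header line):

```shell
# Create a small illustrative log in IIS W3C format;
# here cs(User-Agent) is field 10 on each data line
cat > sample.log <<'EOF'
#Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent)
2017-03-31 00:00:01 10.0.0.5 GET /WebLink/Browse.aspx - 80 - 66.249.66.1 Googlebot/2.1
2017-03-31 00:00:02 10.0.0.5 GET /WebLink/DocView.aspx - 80 - 66.249.66.1 Googlebot/2.1
2017-03-31 00:00:03 10.0.0.5 GET /WebLink/Login.aspx - 80 - 51.15.1.2 AhrefsBot/5.2
EOF

# Count requests per user-agent, busiest first (skip '#' header lines)
awk '!/^#/ { counts[$10]++ } END { for (ua in counts) print counts[ua], ua }' sample.log | sort -rn
```

Clients that reappear immediately after a forced disconnect and show up here with crawler user-agents are almost certainly bots.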

 

Wes

 

replied on May 1, 2017

Hey Wes,

I did look through that post. I think I'm doing it right... but it keeps happening. Their robots.txt file is in both the
c:\Program Files\Laserfiche\Weblink\
and
c:\Program Files\Laserfiche\Weblink\Web Files\
directories, and this is what's in it:

 

User-agent: *
Disallow: /MyWebLink.aspx
Disallow: /Login.aspx
Disallow: /Search.aspx
Disallow: /
User-agent: AhrefsBot
Disallow: / 
<case match="(bot|spider)"> 
          crawler=true 
          tagwriter=System.Web.UI.HtmlTextWriter 
</case> 
<case match="AhrefsBot"> 
          browser=ahrefsbot
          crawler=true 
          tagwriter=System.Web.UI.HtmlTextWriter 
</case> 
<case match="+Googlebot"> 
          browser=googlebot
          crawler=true 
          tagwriter=System.Web.UI.HtmlTextWriter 
</case> 
 

 

From our IIS logs it looks like Googlebot and AhrefsBot are the ones accessing it.

 

What am I missing?

BTW - thanks for all your help.

replied on June 26, 2018

Has anyone tried this with WebLink 10.1? After we upgraded, we started getting bots all the time. We had never implemented any of these blocks before.

replied on March 20, 2019

Wondering if this code will work with Weblink 10.1.

replied on March 20, 2019

Sean,

 This logic was implemented in WebLink 10.x; the code above was for version 8 or below.

replied on June 2, 2022

Are you saying the underlying code should now block connections that have cookies disabled since v10? We were seeing this on v10.x, and we're still experiencing it now that we've upgraded to v11.
