To avoid this, a micro content management system can be designed to extract the maximum potential from the content; at the same time, data stored as XML within a native XML database must be searchable using XPath. This makes it possible to execute an XPath query via a URL from within the user's browser.
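As a minimal sketch of the idea, the query string of a URL can carry an XPath expression that is evaluated against the XML store. The store layout, the `q` parameter, and the function name below are illustrative assumptions, not part of any particular CMS; the sketch uses Python's `xml.etree.ElementTree`, which supports only a limited XPath subset.

```python
# Sketch: URL-driven XPath search against XML content (assumed layout).
import xml.etree.ElementTree as ET
from urllib.parse import urlparse, parse_qs

CONTENT = """<site>
  <post id="1"><title>Hello</title><tag>intro</tag></post>
  <post id="2"><title>XPath basics</title><tag>xml</tag></post>
</site>"""

def run_xpath_from_url(url: str) -> list:
    """Extract the hypothetical 'q' parameter from a request URL and
    evaluate it as an XPath expression against the XML store."""
    query = parse_qs(urlparse(url).query)
    expr = query["q"][0]
    root = ET.fromstring(CONTENT)
    return [el.findtext("title") for el in root.findall(expr)]

# e.g. a browser requesting /search?q=.//post[tag='xml']
titles = run_xpath_from_url("/search?q=.//post[tag='xml']")
print(titles)  # ['XPath basics']
```

A production version would, of course, validate and restrict the expression before evaluating it, for exactly the security reasons discussed below.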
This places limits on how content can be used and reused on the site: it depends on how the user marks up the data, and it resists spamming and illegal posting. However, fast, stable, clean PHP code is required. It should be designed so that beginners can learn from it while advanced programmers can make complex modifications, and the licensing should ensure that programmers are free to study and improve it. This can mitigate the situation of having no firewall in the arrangement and even, to some extent, prevent intruders from abusing a database process.

Several improvements should be made by adding tools to prevent and remove spam from the weblogs. These tools could be bundled under a new section named 'Spam Smasher'. A spam tool must be present to search for spam and delete all matches. A centralized version of the 'Blocked Phrases' tool is needed in addition to the local version, and spam-recognition capability must be improved as the need arises. The trackback system must be hardened from time to time. It must be easy for the user to remove spam with one click and to report it immediately, so that the administrator can add it to the global Blocked Phrases list. A spam log must be kept to detect, check, prevent, and remove spam from the sites and weblogs. Keeping all the vulnerable sections in mind, a high level of security must be maintained for the site. Built-in blockers for referrer, comment, and trackback spam should be present: not only do these block spam as it comes in, they also go a long way towards preventing spammers from overtaxing the web server.
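The search-and-delete behaviour of such a 'Spam Smasher' tool, combining a centralized Blocked Phrases list with a per-weblog local one, can be sketched as follows. The lists, names, and data layout are hypothetical illustrations (the actual system would be in PHP), not the interface of any real blogging package.

```python
# Hedged sketch of a 'Blocked Phrases' filter: a global (centralized)
# list plus a local per-weblog list; matches are found and removed,
# and the removed items are kept for a spam log.

GLOBAL_BLOCKED = {"cheap pills", "casino bonus"}  # assumed sample phrases

def find_spam(comments, local_blocked=frozenset()):
    """Return indices of comments matching any blocked phrase."""
    blocked = GLOBAL_BLOCKED | set(local_blocked)
    hits = []
    for i, text in enumerate(comments):
        lower = text.lower()
        if any(phrase in lower for phrase in blocked):
            hits.append(i)
    return hits

def smash_spam(comments, local_blocked=frozenset()):
    """'One click' removal: delete every match and return
    (kept_comments, removed_comments); the latter feeds the spam log."""
    hits = set(find_spam(comments, local_blocked))
    kept = [c for i, c in enumerate(comments) if i not in hits]
    removed = [c for i, c in enumerate(comments) if i in hits]
    return kept, removed

comments = ["Nice post!", "Buy CHEAP PILLS now", "My casino bonus page"]
kept, removed = smash_spam(comments)
print(kept)  # ['Nice post!']
```

Reporting a phrase to the administrator would then simply promote it from the local set into `GLOBAL_BLOCKED`.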
This is feasible only if knowledge of security strategies, and awareness of them, is sufficient. The first thing to keep in mind is that users and processes must have access to only the privileges they require, and no more. Otherwise, the scenario described here might occur. Individuals used to visit blogs and post their adverts, along with a link to whatever dodgy website they were promoting. Nowadays this has been automated: a clever programmer working for one of these outfits writes a tool that walks a list of weblogs and collects information on the various posts made to them. It can then craft exactly the right HTML to fool the blogging software into thinking that a comment has been entered, and the resulting advert is posted to the blog as if it were legitimate.
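One common countermeasure to this kind of replayed, crafted POST request is to embed a single-use, server-issued token in the comment form, so that a request fabricated without first fetching the form is rejected. The sketch below illustrates that defence under assumed names; it is one possible technique, not a mechanism described by the text itself.

```python
# Sketch: one-time form tokens against automated comment posting.
# A bot that crafts the POST directly, or replays an old one, has no
# valid token and is rejected. All names here are illustrative.
import secrets

_issued_tokens = set()

def render_comment_form():
    """Issue a fresh token when the form page is served; it would be
    embedded as a hidden form field."""
    token = secrets.token_hex(16)
    _issued_tokens.add(token)
    return token

def accept_comment(token, text):
    """Accept the POST only if it carries an unused, server-issued token."""
    if token not in _issued_tokens:
        return False  # crafted or replayed request: reject
    _issued_tokens.discard(token)  # single use
    return True

t = render_comment_form()
print(accept_comment(t, "legit comment"))      # True
print(accept_comment(t, "replayed comment"))   # False
print(accept_comment("forged", "bot advert"))  # False
```

This also fits the least-privilege principle above: the comment endpoint trusts nothing it did not itself hand out.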
After all, a public blog with an accessible comments page is hardly a closed system, and even if there is an acceptable-use policy stating which sorts of postings are welcome, that is not legally binding either. To avoid this, one would have to put strong restrictions on posting comments, which undermines the blog's primary objective of removing restrictions on online conversation. For this reason an automated tagging process may be suggested. This is an