2.0 Meeting – July 7, 2009



2.0 Everywhere – Some Background Information

Gerhard Fischer

Contents

2.0 Everywhere: Overview
2.0 Concepts and Size of Repositories
A Tag Cloud Representing 2.0 Concepts
Size of Repositories Created by 2.0 Cultures
2.0 Example Domains
Vatican 2.0: Pope gets his own YouTube channel
Health 2.0
Disaster 2.0
President 2.0
Learning 2.0
Electricity 2.0: Using the Lessons of the Web to Improve Our Energy Networks
Government 2.0 (ACM)
What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software
Workshops
Web 3.0: Merging Semantic Web and Social Web - (SW)^2
Adaptation and Personalization for Web 2.0 – AP-WEB2.0
Intelligent Techniques for Web Personalization & Recommender Systems

see:

2.0 Everywhere: Overview

The 2.0 paradigm (fostering and supporting social production and mass engagement and collaboration) has been spreading to all areas of human activity:

  • Web 2.0 – User-generated content provided by participants worldwide dominates new information environments in all areas, including open source software (such as Linux), encyclopedias (such as Wikipedia), photo and movie sharing sites (such as Flickr and YouTube), 3D models (such as the 3D Warehouse used for Google Earth), and social networking sites (such as Facebook). Some of these developments are described in more detail below.
  • Learning 2.0 – New models of learning, integrating formal and informal learning, focus on communities of learners engaged in collaborative knowledge construction rather than on one-sided processes in which only teachers are responsible for instructionist learning.
  • Science 2.0 – The expansion of traditional scientific methods (as argued by Ben Shneiderman) to deal with the complex issues that arise as social systems meet technological innovation, driven by exploiting the possibilities of the networked information society.
  • Electricity 2.0 – Applying the lessons of the Web to improve our energy networks: power distribution has been a top-down, subscribe-only model, but the electricity grids of tomorrow will benefit greatly from informed users who take an active part in using the smart grid to save energy.
  • President 2.0 – A reconceptualization that transforms government from a system in which officials hand down laws and provide services to citizens into one that uses the Internet to let citizens, corporations, and civil organizations work together with elected officials to develop solutions.
  • Cognitive-Levers 2.0 – Our research effort in the CLever Project to develop socio-technical environments for people with cognitive disabilities, in which caregivers (parents, assistive technology teachers) are empowered to engage in cultures of participation to share information and to modify and evolve systems to fit the unique needs of individuals with cognitive disabilities.

2.0 Concepts and Size of Repositories

A Tag Cloud Representing 2.0 Concepts

[Image: web-20.png – a tag cloud of 2.0 concepts]
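
Since only the image filename survived the export, here is a minimal Python sketch of how a tag cloud like the one above is typically produced: term frequencies are mapped to display sizes, here on a log scale. The tags and counts are invented for illustration and are not taken from the figure.

    # Minimal tag-cloud weighting sketch; the tags and counts are invented.
    import math

    tag_counts = {"blogs": 120, "wikis": 95, "tagging": 80, "mashups": 40,
                  "folksonomy": 30, "social networking": 110}

    def font_sizes(counts, min_pt=10, max_pt=36):
        """Map raw tag counts to display sizes on a log scale."""
        lo, hi = math.log(min(counts.values())), math.log(max(counts.values()))
        span = (hi - lo) or 1.0  # avoid division by zero when all counts are equal
        return {tag: round(min_pt + (math.log(n) - lo) / span * (max_pt - min_pt))
                for tag, n in counts.items()}

    for tag, size in sorted(font_sizes(tag_counts).items(), key=lambda kv: -kv[1]):
        print(f"{size:2d}pt  {tag}")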

Size of Repositories Created by 2.0 Cultures

  • YouTube 

    • 10 billion video views per month (the third most active website in the world)
    • location of choice to present personal videos, new product demos, and political agendas
  • Wikipedia

    • more than 10 million articles in 250 languages
    • one of the top ten websites in the world
  • Facebook, MySpace, LinkedIn, and other social networking sites:

    • are expected to grow to 1 billion participants by 2012 (Alexa, 2009)
  • Billions of people 

    • contribute knowledge and opinions through wikis, discussion forums, and blog communities,
    • build collective intelligence by tagging photos, rating movies, reviewing restaurants, and commenting on political events

2.0 Example Domains

Vatican 2.0: Pope gets his own YouTube channel

http://apnews.excite.com/article/20090124/D95TBLP00.html

Jan 24, 1:48 AM (ET) – By NICOLE WINFIELD

VATICAN CITY (AP) - Puffs of smoke, speeches in Latin and multipage encyclicals have all been used by the Vatican to communicate with the faithful. Now the pope is trying to broaden his audience by joining the wannabe musicians, college pranksters and water-skiing squirrels on YouTube. In his inaugural YouTube foray Friday, Pope Benedict XVI welcomed viewers to this "great family that knows no borders" and said he hoped they would "feel involved in this great dialogue of truth."

"Today is a day that writes a new page in history for the Holy See," Vatican Radio said in describing the launch of the site, http://www.youtube.com/vatican

The Vatican said that with the YouTube channel, it hoped to broaden and unite the pontiff's audience - an estimated 1.4 billion people are online worldwide - while giving the Holy See better control over the pope's Internet image.

The pontiff joins President Barack Obama, who launched an official White House channel on his inauguration day, as well as Queen Elizabeth, who went online with her royal YouTube channel in December 2007.

For the Vatican, it was the latest effort to keep up to speed with the rapidly changing field of communications and new media. For a 2,000-year-old institution known for being very set in its ways, it was something of a revolution.

At the same time, though, the pope warned he wasn't embracing virtual communication without some reservation.

In his annual message for the World Day of Communication, Benedict praised as a "gift to humanity" the benefits of social networking sites such as Facebook and MySpace in forging friendships and understanding.

But he also warned that virtual socializing had its risks, saying "obsessive" online networking could isolate people from real social interaction and broaden the digital divide by further marginalizing people.

And he urged producers of new media to ensure the content respected human dignity and the "goodness and intimacy of human sexuality."

The 81-year-old pope has been extremely wary of new media, warning about what he has called the tendency of entertainment media, in particular, to trivialize sex and promote violence.

But Monsignor Claudio Maria Celli, who heads the Vatican's social communications office, said the pope fully approved of the YouTube channel, saying Benedict was "a man of dialogue" who wanted to engage with people wherever they were.

In that way, he is merely following in the footsteps of Pope John Paul II, who avidly used mass media and information technology to get out his message. John Paul oversaw the 1995 launch of the Vatican's Web site (http://www.vatican.va), which today includes virtual tours of the Vatican Museums and audio feeds from Vatican Radio.

While John Paul wasn't a big computer user, he did tap out a very public online message in 2001, an apology for missionary abuses against indigenous peoples of the South Pacific.

Under John Paul, the Vatican also jumped on the text-messaging bandwagon, sending out daily texts with the pope's prayer of the day.

The Vatican's press office even alerted the world of John Paul's April 2, 2005, death by sending an e-mail with a text-messaged alert to journalists.

Asked if Benedict himself surfs the Web, Celli quipped: "Knowing him, that he's a man of research, a man who is up to speed with things, I'd have to respond affirmatively."

One of his advisers, Cardinal Crescenzio Sepe, the archbishop of Naples, has gone a step further: He has his own Facebook profile. So does Cardinal Roger Mahony, archbishop of Los Angeles.

Celli said the Vatican was mulling over a similar presence on Facebook.

While the YouTube initiative was novel, it was in keeping with the Church's history of using whatever means available to communicate: parchment, printing press, radio, television and Internet, noted Monsignor Robert Wister, professor of church history at the Immaculate Conception School of Theology at Seton Hall University in New Jersey.

What is significant about the YouTube initiative, he said, was that "it's a way of communicating the church's message beyond the members of the Church."

In all, YouTube owner Google, Inc. counts over 1,000 institutions and other content producers that have their own channels. Google says hundreds of millions of online videos are watched every day on YouTube.

Celli said the Vatican was launching the channel in part to have some control over the pontiff's online image, which he said already was being used on sites respectful of the papacy and others that are not.

A search of "Pope Benedict XVI" Friday turned up videos of a fake pope dancing and juggling and images of the real Benedict doctored to show him dressed as a superhero, Darth Vader and Yoda.

While there is little the Vatican can do legally to shut down blasphemous or pornographic sites, Celli said it can at least control the content of what it puts on its own channel.

"We just want to put the images at the disposition of the public in the correct way," said Vatican spokesman Rev. Federico Lombardi.

He said no money changed hands to launch the channel and the Vatican wouldn't earn anything from publicity.

He said that for now there were limited plans to use YouTube's interactive options: the Vatican would receive messages but not respond to them.

The Vatican will update the YouTube site daily with papal news items produced by the Vatican television station CTV and Vatican Radio. They will be translated into Italian, German, English and Spanish.

Google's managing director for media solutions, Henrique de Castro, said Google was working to ensure the site was available in China, where authorities occasionally block foreign news sites. On Friday it was, but church authorities have accused Beijing in the past of blocking the faithful's access to the pontiff's messages.

Health 2.0

http://www.businessweek.com/magazine/content/08_50/b4112058194219.htm

Medicine has always been a top-down affair. Doctors, drug companies, regulators, and researchers are the expert gate-keepers, telling patients what they need to know. Even their own medical records are locked away to protect their privacy. So what would happen if critically ill patients joined together, obtained their personal information, and made it public?

Just such a real-world experiment is under way at a Web-based social network started by the company PatientsLikeMe. The two-year-old venture has already signed up 23,000 participants in five chronic-illness categories - amyotrophic lateral sclerosis (ALS), Parkinson's disease, HIV/AIDS, multiple sclerosis, and mood disorders.

Disaster 2.0

http://americancity.org/daily/entry/80/

Now, with the increasing availability of Geographic Information System (GIS) applications, it looks like we may soon arrive at a place where disasters are documented in real time and tracked on disaster maps that anyone with access to the web - but especially first responders - can use. Researchers at San Diego State University are moving wildfire mapping in just this direction. As the most recent fires were underway, major newspapers featured Google maps with flags pointing out active fires, their paths, and their swaths of destruction.

from Sophia Liu:

The video talks about Ushahidi crowdsourcing eyewitness reports during crises, but also about how they are now starting to work on filtering crowdsourced information. That's definitely the next big step for all of us in this domain.

http://www.ted.com/talks/erik_hersman_on_reporting_crisis_via_texting.html

President 2.0

http://www.newsweek.com/id/170347/output/print

Obama harnessed the grass-roots power of the Web to get elected. How will he use that power now?

Daniel Lyons and Daniel Stone

NEWSWEEK – From the magazine issue dated Dec 1, 2008

Barack Obama is the first major politician who really "gets" the Internet. Sure, Howard Dean used the Web to raise money. But Obama used it to build an army. And now, that army of digital kids expects to stick around and help him govern. Crowd-sourced online brainstorming sessions? Web sites where regular folks hash out policy ideas and vote yea or nay online? A new government computer infrastructure that lets people get a look into the workings of Washington, including where the money flows and how decisions get made? Yes to all those and more. "This was not just an election - this was a social movement," says Don Tapscott, author of "Grown Up Digital," which chronicles the lives of 20-somethings raised on computers and the Web. "I'm convinced," Tapscott says, "that we're in the early days of fundamental change in the nature of democracy itself."

Call it Government 2.0. Instead of a one-way system in which government hands down laws and provides services to citizens, why not use the Internet to let citizens, corporations and civil organizations work together with elected officials to develop solutions? That kind of open-source collaboration is second nature to the Net-gen kids who supported Obama and to technologists from Silicon Valley who are advising him. "An open system means more voices; more voices mean more discussion, which leads to a better decision," Google CEO and Obama adviser Eric Schmidt told a roomful of policy thinkers in Washington last week, gathered for a discussion on the role technology will play in government. "A community will always make a better decision than an individual."

Obama's transition team is already building an organization to carry on the Internet efforts begun during the campaign. On the stump, Obama laid out plans for a technology czar in his administration - a senior-level, or even cabinet-level, post that he promised would make his White House transparent and ultra-efficient. Obama has talked about streaming portions of cabinet meetings live on the Internet in order to reach more people, and not long after his election he gave one of his first "radio" addresses in video form on YouTube. He's also asked that candidates for jobs in his administration submit their information online, so more than just Washington insiders would be considered.

"New media will be at the center of the action, helping the entire executive branch run faster," says Thomas Gensemer, managing partner of Blue State Digital, the Washington, D.C., tech strategy firm that built the Obama campaign's social networking site, my.BarackObama.com. Gensemer expects the fired-up Obama army to stay committed to the cause. "If anything, with Obama now in office, they'll want to participate more, not less, and take part in the governing process," he says. (That's not the case for some of the young turks who helped Obama build his Web campaign. Joe Rospars, who ran Obama's Internet team, is returning to Blue State Digital, which he cofounded in 2004. Other top staff expressed privately that the bigger opportunities and money will be found in dotcom, not dotgov.)

Continuing the Internet efforts of the campaign raises some tricky legal questions. One challenge is figuring out how to keep using the personal data gathered from more than 10 million supporters during the campaign. Federal election rules prohibit President Obama from interacting with supporters in the same way as Candidate Obama did. When he becomes everybody's president, the law says he can't communicate only with the people who voted for him. Like his recent predecessors, he'll have to use the WhiteHouse.gov Web site to make sure everyone's included. Transition officials are looking for ways to sidestep the rules. One maneuver they're considering involves setting up a nonprofit organization that would purchase the Obama supporter lists (names, phone numbers, e-mail addresses) from the campaign, says Steve Hildebrand, former deputy manager of the campaign. The nonprofit would serve as a conduit, letting the administration maintain indirect contact with supporters. The nonprofit, likely to be set up as a 501(c)(3) tax-exempt organization, could encourage supporters to push legislators on policy issues by, say, flooding a Senate office with phone calls and e-mails, or arranging demonstrations via Facebook to push for universal health care.

Federal disclosure laws could further limit Obama's participation in all this new Internet activity. Statutes say that any official correspondence from the president becomes property of the office, not the man in it. The rules were drafted at a time when the president's sole communication was on paper, and there wasn't that much of it. But now, with things like e-mail and instant messaging, the most mundane messages from or to Obama would become government property, and much of it would eventually be accessible to the public under the Freedom of Information Act. For this reason, Obama earlier this month started to wean himself from his BlackBerry. If he wanted to, he could choose to keep it. But if he did, he'd have to acknowledge that a historian decades from now could study just how much time the president spent bantering with pals or gushing about the White Sox. "He'll be restricted by how much information about him will become public property," says Lawrence Lessig, founder of the Center for Internet and Society at Stanford. "This is an area where the statutes are far out of date for the current technology." Security officials also worry about Obama using the device for official business, fearing a hacker could gain access to internal deliberations.

But maybe Obama, who espoused openness on the campaign trail, should just hang on to his BlackBerry and not worry about what historians think. (NEWSWEEK's Jonathan Alter believes that's the right way to go.) Ellen Miller, director of the Sunlight Foundation, which advocates for government transparency, expects technology in an Obama administration will have two components: transparency and connectedness. Transparency means using technology to open the windows of government, allowing all Americans with a computer to supervise the officials they've elected, starting with Obama. The president-elect has talked about crafting a user-friendly portal where people could look up and comment on legislation before he signs it.

Connectedness, Miller says, means allowing people outside government to have a bigger role in crafting policy (or at least feel like they have a bigger role). It might mean a period of a few days for open comments on newly passed legislation before Obama signs it into law, or administration-sponsored wiki Web sites that would let users make suggestions on budget bills, which are often notoriously opaque. One example that already exists is a privately run "social action" Web site called Change.org. An idea board on the site allows users to make suggestions, then other users give an up or down vote on what has been put forward, much like on the news and article aggregation Web site Digg.com. "Close Guantánamo prison camp," is currently the top-rated idea.
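
The Change.org idea board described above reduces to a very simple mechanism: suggestions accumulate up and down votes and are ranked by net score, much like on Digg. A toy Python sketch of that mechanism (the idea titles and vote counts below are invented for illustration):

    # Toy Digg-style idea board: rank suggestions by net up/down votes.
    # The ideas and vote counts are invented for illustration.
    ideas = [
        {"title": "Close Guantanamo prison camp", "up": 9120, "down": 430},
        {"title": "Publish bills online before signing", "up": 5210, "down": 310},
        {"title": "Stream cabinet meetings on the Web", "up": 2780, "down": 540},
    ]

    def ranked(board):
        """Sort ideas by net score (upvotes minus downvotes), highest first."""
        return sorted(board, key=lambda idea: idea["up"] - idea["down"], reverse=True)

    for idea in ranked(ideas):
        print(idea["up"] - idea["down"], idea["title"])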

The trick for Obama will be to lead the Netroots movement rather than be led by it. Tapscott, the author of "Grown Up Digital," thinks there's a real risk of backlash if the kids who supported Obama feel their hero has let them down. "If he betrays this generation, the protests of the '60s will look like a tea party," Tapscott says. But Markos Moulitsas, captain of the liberal blog DailyKos.com and an occasional NEWSWEEK contributor, doesn't think Obama's base would turn on him. "If they get disillusioned, they'll probably just become apathetic again," he says. "I couldn't see disappointed supporters becoming enraged against him. "

Whatever the risks, the president-elect has made it clear he wants all those voices at the table, building a grass-roots-style government that won't always agree with him. That could mean tens of millions of voices, all with different thoughts and priorities, constantly fighting for one man's ear. One thing we sometimes overlook in our tech-obsessed culture is that technology in and of itself doesn't automatically speed things up. It could, in fact, slow things down.

With Barrett Sheridan

Learning 2.0

characterized by:

  • cultures of participation: Tinkering, Building, Remixing, Mash-Ups & Sharing
  • active blending: researching, re-searching & learning meld into a new kind of distributed learning/knowledge ecosystem, with mentors who are both peers and masters
  • from instruction to interest-driven participation
  • learning on demand
  • a new Culture of Learning = a culture that thrives on participatory lifelong learning

Electricity 2.0: Using the Lessons of the Web to Improve Our Energy Networks

http://www.web2expo.com/webexberlin2008/public/schedule/detail/4907

Tom Raftery (RedMonk), James Governor (RedMonk)

14:35 Thursday, 23-10-2008

Presentation: Electricity 2.0 – Using the Lessons of the Web to Improve Our Energy Networks (PDF file)

For too long, power distribution has been a top down, subscribe only model, but the electricity grids of tomorrow will be read/write, just like the Web. It's a commonplace to talk about how IT should be delivered as a utility, but what about delivering a utility the same way the Web works? Utilities need to become more like the Internet: disparate, disconnected electrical grids will be joined up to give us one global electricity super-grid. Imagine the resilience: electricity that can route around problems.

Think about how much more stable the super-grid would be if the excess energy produced by, for instance, Scandinavian wind farms on windy nights could simply be sold to meet capacity shortages in the U.S. as people arrive home from work, or in Japan as they start to wake up.

What if the grid were smart, publishing prices in real time, based on supply and demand fluctuations? And further, what if smart meters in homes and businesses could adjust appliances based on the real-time pricing (thermostats up/down, devices on/off, etc.)? And what if, again like the Internet, the super-grid were read/write, i.e., if you could be a producer as well as a consumer?
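
As a rough illustration of the "smart meters adjusting appliances based on real-time pricing" idea, here is a small Python sketch. The price feed, thresholds, and appliance names are all invented for the example; a real smart grid would use standardized signaling rather than a hard-coded list.

    # Hypothetical smart-meter sketch: appliances run only when the real-time
    # price is below their threshold. Prices, thresholds, and names are invented.
    PRICE_FEED = [0.08, 0.12, 0.31, 0.45, 0.22, 0.09]  # $/kWh, one reading per interval

    APPLIANCES = {
        "water_heater": 0.25,  # run only while price <= threshold
        "ev_charger": 0.15,
        "dishwasher": 0.20,
    }

    def schedule(prices, appliances):
        """Decide, for each price interval, which appliances stay on."""
        plan = []
        for t, price in enumerate(prices):
            on = sorted(name for name, limit in appliances.items() if price <= limit)
            plan.append((t, price, on))
        return plan

    for t, price, on in schedule(PRICE_FEED, APPLIANCES):
        print(f"interval {t}: ${price:.2f}/kWh -> on: {', '.join(on) or 'nothing'}")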

In this talk Tom Raftery will explain how this vision will be realized, which companies and geographies are leading the charge, and what you should do to encourage the change.

Government 2.0 (ACM)

http://www.acm.org/public-policy/open-government

ACM U.S. Public Policy Committee (USACM) – Recommendations on Open Government

ACM Urges Steps to Transform Government for a Web 2.0 World - Recommendations Aimed at Advancing the Administration's Directive on Transparency and Open Government

BACKGROUND

Citing trends showing that individual citizens, companies, and organizations are using technology to analyze government-compiled data in creative and collaborative ways, the USACM statement urges government policies that will promote a dynamic force of third-party Internet sites and tools to enhance the usefulness of government data. The statement follows the recent release of the Administration's Memorandum on Transparency and Open Government, which is intended to establish a system of transparency, public participation, and collaboration.

"Technology has given us powerful new tools for data gathering, analysis, social interaction and collaboration," said Edward Felten, Vice Chair of USACM. "Internet users are combining and analyzing information in innovative ways that go beyond what the data's original publishers imagined. Government has a treasure trove of data and it can unleash creative new analysis by giving users access to this data in a format that allows them the advantage of easy, fast integration, machine-readability, download capability, and authenticity measures," added Felten, Professor of Computer Science and Public Affairs, and Director of the Center for Information Technology Policy at Princeton University.

Computing and networking technology has made it easier than ever before for organizations and individuals to share, analyze and understand large bodies of information. Government agencies and legislators have long recognized the value of the Internet, having helped to create it, and share a strong commitment to providing for the information needs of citizens and others.

Government agencies increasingly post information -- often for the benefit of individual citizens -- on the Internet and through the World Wide Web (WWW). The U.S. Public Policy Committee of the ACM (USACM) applauds ongoing efforts to make these data as accessible as possible to all Americans. However, law, custom and technology have all contributed to diverse and often inconsistent forms of publication for the data provided.

Many Internet users are learning to control their online experience, including combining and analyzing information in innovative ways that go beyond what the data's original publishers imagined. Individual citizens, companies and organizations have begun to use computers to analyze government data, often creating and sharing tools that allow others to perform their own analyses. This process can be enhanced by government policies that promote data reusability, which often can be achieved through modest technical measures. But today, various parts of governments at all levels have differing and sometimes detrimental policies toward promoting a vibrant landscape of third-party web sites and tools that can enhance the usefulness of government data.

USACM makes the following policy recommendations for data that is already considered public information.

RECOMMENDATIONS

  • Data published by the government should be in formats and approaches that promote analysis and reuse of that data.
  • Data republished by the government that has been received or stored in a machine-readable format (such as online regulatory filings) should preserve the machine-readability of that data.
  • Information should be posted so as to also be accessible to citizens with limitations and disabilities.
  • Citizens should be able to download complete datasets of regulatory, legislative or other information, or appropriately chosen subsets of that information, when it is published by government.
  • Citizens should be able to directly access government-published datasets using standard methods such as queries via an API (Application Programming Interface); a sketch of such a query appears after this list.
  • Government bodies publishing data online should always seek to publish using data formats that do not include executable content.
  • Published content should be digitally signed or include attestation of publication/creation date, authenticity, and integrity.
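
To make the API recommendation concrete, here is a minimal Python sketch of the kind of client-side query it enables. The endpoint URL, parameters, and field names are hypothetical; the point is only that machine-readable, downloadable data lets third parties build their own analysis tools.

    # Sketch of a third-party client querying a hypothetical open-government API
    # for machine-readable data. The URL and field names are invented.
    import json
    import urllib.request

    BASE_URL = "https://data.example.gov/api/regulatory-filings"  # hypothetical endpoint

    def fetch_filings(agency, year):
        """Query the (hypothetical) API and return the parsed JSON records."""
        url = f"{BASE_URL}?agency={agency}&year={year}&format=json"
        with urllib.request.urlopen(url) as response:
            return json.load(response)

    # Example use (would only work against a real endpoint):
    # for record in fetch_filings("epa", 2009):
    #     print(record["title"], record["published"])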

What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software

http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html

by Tim O'Reilly

09/30/2005

The bursting of the dot-com bubble in the fall of 2001 marked a turning point for the web. Many people concluded that the web was overhyped, when in fact bubbles and consequent shakeouts appear to be a common feature of all technological revolutions. Shakeouts typically mark the point at which an ascendant technology is ready to take its place at center stage. The pretenders are given the bum's rush, the real success stories show their strength, and there begins to be an understanding of what separates one from the other.

The concept of "Web 2.0" began with a conference brainstorming session between O'Reilly and MediaLive International. Dale Dougherty, web pioneer and O'Reilly VP, noted that far from having "crashed", the web was more important than ever, with exciting new applications and sites popping up with surprising regularity. What's more, the companies that had survived the collapse seemed to have some things in common. Could it be that the dot-com collapse marked some kind of turning point for the web, such that a call to action such as "Web 2.0" might make sense? We agreed that it did, and so the Web 2.0 Conference was born.

In the year and a half since, the term "Web 2.0" has clearly taken hold, with more than 9.5 million citations in Google. But there's still a huge amount of disagreement about just what Web 2.0 means, with some people decrying it as a meaningless marketing buzzword, and others accepting it as the new conventional wisdom.

Workshops

Web 3.0: Merging Semantic Web and Social Web - (SW)^2

http://www.cs.pitt.edu/%7Erosta/EICK/

in conjunction with

the 20th ACM Conference on Hypertext and Hypermedia

June 29, 2009 - Turin, Italy

MOTIVATIONS

Web 2.0 introduced the remarkable phenomenon of user-generated content. Many of the most popular sites on the Web are currently mainstream Web 2.0 applications with rich user-generated content; Wikipedia, YouTube, Flickr, and del.icio.us are classic examples. Web 2.0 allows users to do more than just retrieve information, since it is based upon an architecture of participation that reduces the barriers to online collaboration and encourages the generation and distribution of content. For this reason it is also called the Social Web. Users are encouraged to provide data and metadata in simple ways such as tagging, rating, commenting, and blogging. As a result, Web 2.0 applications are collecting large amounts of data. However, this data is poorly structured, highly subjective, and often buried in low-quality content. The more data there is, the more challenging it becomes for users to find relevant information. Social web applications prefer not to deal with this problem and simply present this content to users as it is, in the form of fuzzy aggregations such as scattered tag clouds and folksonomies, which become very confusing and ineffective for users.

Adding formal semantics to tags can be an important step toward better navigation and searching, and should help to transform tag clouds and folksonomies into valuable aggregations representing a sort of "collective knowledge". There is an opportunity to exploit such Web 2.0 collective knowledge (together with individual user knowledge) in order to achieve the vision of Web 3.0. Web 3.0, also called the Intelligent Web, refers to the provision of a more productive, personalized and intuitive environment through the integration of Semantic Web and, more generally, Artificial Intelligence technologies that emphasize information understanding. Semantics seems to be a necessary part of the next generation of the Web. Information has to be structured in such a way that machines can read and understand it as well as humans can, without ambiguity. Collective knowledge can also be a useful source for adaptive applications, since tags (and folksonomies) represent a novel aspect to be considered in any collaborative scenario.
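
A very small illustration of "adding formal semantics to tags" is to map free-form folksonomy tags onto a controlled vocabulary of concepts. The Python sketch below does this with a hand-made synonym table; the concepts and synonyms are invented for the example, and a real system would draw them from an ontology.

    # Toy sketch: normalize free-form tags against a controlled vocabulary.
    # The concepts and synonym sets are invented for illustration.
    CONCEPTS = {
        "san_francisco": {"sf", "sanfrancisco", "san-francisco", "frisco"},
        "photography": {"photo", "photos", "pics"},
        "web_2.0": {"web2.0", "web20", "socialweb"},
    }

    def normalize(tag):
        """Return the controlled concept for a raw tag, or the raw tag if unknown."""
        t = tag.strip().lower()
        for concept, synonyms in CONCEPTS.items():
            if t == concept or t in synonyms:
                return concept
        return t

    raw_tags = ["SF", "photos", "web20", "sunset"]
    print([normalize(t) for t in raw_tags])
    # -> ['san_francisco', 'photography', 'web_2.0', 'sunset']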

The main goal of this workshop is to provide a forum where researchers and practitioners from different fields can meet to discuss the state of the art and the latest ideas and issues in the use of collective knowledge and user modelling through the semantic social web.

During the workshop, participants will interact face-to-face to formulate a coherent synthesis of the contributions. After the workshop, interested participants will help to bring the results into a form suitable for publication (e.g., as a journal article or a special issue).

TOPICS

The focus of the workshop includes but is not restricted to the following topics:

  • Characterization of Web 3.0
  • Advantages of the "semantic social web" over the semantic or social web
  • Creation of "structured" collective knowledge from users' contributions
  • Folksonomies vs controlled vocabularies or ontologies
  • Semantic tagging and annotation for social web
  • Information retrieval in the Web 3.0 scenario
  • Personalization and recommender systems in the Web 3.0 scenario

Adaptation and Personalization for Web 2.0 – AP-WEB2.0

http://ailab.dimi.uniud.it/en/events/2009/ap-web20/

This workshop will be co-located with UMAP 2009 (http://umap09.fbk.eu/), June 22-26, 2009, Trento, Italy. Paper submission deadline: March 30, 2009.

OBJECTIVES

This workshop aims to discuss the state of the art, open problems, challenges and innovative research approaches in adaptation and personalization for Web 2.0; it provides a forum for proposing innovative and open models, applications and new data-sharing scenarios, as well as novel technologies and methodologies for creating and managing these applications.

Examples of stimulating application fields are social bookmarking environments, publication sharing systems, or, more generally, digital libraries.

Three specific questions motivate this workshop.

  1. How can adaptation and personalization methodologies augment Web 2.0 environments? And how can social adaptation mechanisms be evaluated?
  2. What models, techniques, and tools are most adequate to better support Web 2.0 users?
  3. How much can the introduction of tools for structuring personal user spaces (currently flat) improve the creation and navigation processes and social awareness?

TOPICS

The topics of interest for the workshop are listed below. All of them should be considered from the combined Web 2.0/Adaptation & Personalization perspective. Topics not explicitly listed below that nevertheless adhere to the goals of the workshop will be considered as well.

General

* Adaptation and personalization models and goals for social systems
* Modeling teams and groups in Web 2.0

Information Access and Extraction

* Advanced tools for information access in social networks
* Recommender systems for new content
* Personalized content ranking
* Social navigation support
* Social search and browsing
* Personal information spaces
* Information extraction, opinion mining, and sentiment analysis
* Visualizing small-world/scale-free networks
* User concept spaces and maps

Sharing Data and Knowledge

* Knowledge sharing
* Sharing user profiles in social networks
* User contribution
* Decentralized user modeling in social networks

Folksonomies and Tagging

* Automatic tagging
* Ontology-based computer-supported tagging
* User profile construction based on tagging and annotations
* Tag recommendation in social tagging systems

Analyzing UGC and Social Networks

* Social network analysis
* Content-based analysis of social networks
* Modeling trust and reputation
* Metrics and key performance indicators for social network analysis

User Awareness

* Social awareness and visualization
* Personalized and adaptive views
* Motivating participation
* User identities in social systems: evolution and stigmergy impact
* Capturing and processing implicit and explicit feedback
* Trust-based recommendation

Evaluation

* Evaluation of community-based adaptation techniques
* Evaluation of social adaptation mechanisms

Intelligent Techniques for Web Personalization & Recommender Systems

Second CALL FOR PAPERS

7th Workshop on Intelligent Techniques for Web Personalization & Recommender Systems

In conjunction with IJCAI 2009

July 11-17, 2009 - Pasadena, California, USA

http://maya.cs.depaul.edu/~mobasher/itwp09/

Submission Deadline: March 6, 2009

Web Personalization can be defined as any set of actions that can tailor the Web experience to a particular user or set of users. The experience can be something as casual as browsing a Web site or as (economically) significant as trading stocks or purchasing a car. The actions can range from simply making the presentation more pleasing to anticipating the needs of a user and providing customized and relevant information. To achieve effective personalization, organizations must rely on all available data, including the usage and click-stream data (reflecting user behaviour), the site content, the site structure, domain knowledge, as well as user demographics and profiles. Efficient and intelligent techniques are needed to mine this data for actionable knowledge, and to effectively use the discovered knowledge to enhance the users' Web experience. These techniques must address important challenges emanating from the size of the data, the fact that they are heterogeneous and very personal in nature, as well as the dynamic nature of user interactions with the Web. These challenges include the scalability of the personalization solutions, data integration, and successful integration of techniques from machine learning, information retrieval and filtering, databases, agent architectures, knowledge representation, data mining, text mining, statistics, information security and privacy, user modelling and human-computer interaction. 

Recommender systems represent one special and prominent class of such personalized Web applications, which particularly focus on the user-dependent filtering and selection of relevant information and - in an e-Commerce context - aim to support online users in the decision-making and buying process. Recommender systems have been a subject of extensive research in AI over the last decade, but with today's increasing number of e-Commerce environments on the Web, the demand for new approaches to intelligent product recommendation is higher than ever. There are more online users, more online channels, more vendors, more products and, most importantly, increasingly complex products and services. These recent developments in the area of recommender systems have generated new demands, in particular with respect to interactivity, adaptivity, and user preference elicitation. These challenges, however, are also a focus of general Web Personalization research.
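
As a toy illustration of the collaborative-filtering idea behind many of the recommender systems discussed here, the Python sketch below scores items a user has not yet rated by the ratings of similar users (cosine similarity over shared items). The ratings matrix is invented; production systems add normalization, scalability, and many other refinements.

    # Toy user-based collaborative filtering: score unseen items for a user by the
    # ratings of similar users. The ratings matrix below is invented.
    from math import sqrt

    ratings = {
        "alice": {"item1": 5, "item2": 3, "item3": 4},
        "bob": {"item1": 4, "item2": 2, "item4": 5},
        "carol": {"item2": 5, "item3": 4, "item4": 1},
    }

    def cosine(u, v):
        """Cosine similarity between two sparse rating vectors (dicts)."""
        shared = set(u) & set(v)
        if not shared:
            return 0.0
        dot = sum(u[i] * v[i] for i in shared)
        norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
        return dot / norm

    def recommend(user, k=2):
        """Return the top-k unseen items, scored by similarity-weighted ratings."""
        scores, weights = {}, {}
        for other, other_ratings in ratings.items():
            if other == user:
                continue
            sim = cosine(ratings[user], other_ratings)
            for item, r in other_ratings.items():
                if item not in ratings[user]:
                    scores[item] = scores.get(item, 0.0) + sim * r
                    weights[item] = weights.get(item, 0.0) + sim
        ranked = sorted(((s / weights[i], i) for i, s in scores.items() if weights[i]),
                        reverse=True)
        return ranked[:k]

    print(recommend("alice"))  # -> roughly [(2.79, 'item4')]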

In the face of this increasing overlap of the two research areas, the aim of this workshop is to bring together researchers and practitioners of both fields, to foster an exchange of information and ideas, and to facilitate a discussion of current and emerging topics related to "Web Intelligence".   

