package OddMuse;

=head1 NAME

OddMuse::Searching::EvilDoers - Evil Functions

=head1 DESCRIPTION

This is a module of functions that do 'evil' things to Oddmuse, such as
totally clobbering existing Oddmuse functions.  Some of these overrides are
genuinely needed, and some are here because I'm lazy.

=head1 DEPENDENCIES

L<OddMuse::Searching>

=cut

use OddMuse::Searching;

=head1 FUNCTIONS

=over 4

=item * GetSearchLink

Fixes the default title search.  Quotes in the search text mess things up, and
the underscore-to-plus translation causes a few oddities with journal entries.
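
For illustration, a call might look like this (the page name is hypothetical;
C<ScriptLink> and C<UrlEncode> come from the Oddmuse core):

    my $html = GetSearchLink( '2006-10-07_My_Entry', 'search', '',
                              'Search for this title' );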

=cut

sub GetSearchLink {
    my ( $text, $class, $name, $title ) = @_;
    my $id = UrlEncode( QuoteRegexp( $text ) );
    $name = UrlEncode( $name );
    $text = NormalToFree( $text );
    $id =~ s/_/+/g;    # Search for url-escaped spaces
  return ScriptLink( 'search=' . $id, $text, $class, $name, $title );
} ## end sub GetSearchLink

=item * SearchTitleAndBody

A complete re-implementation that uses the search abilities of
OddMuse::Database.  This one is really needed!  It does not yet know how to
handle uploaded files the way the native Oddmuse function does.
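
Note that the C<$func> and C<@args> parameters accepted by the native function
are unpacked but ignored here; callers simply get back a list of matching page
ids.  A hypothetical call:

    my @ids = SearchTitleAndBody( 'apache' );
    print "$_\n" for @ids;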

=cut

sub SearchTitleAndBody {
    my ( $searchtext, $func, @args ) = @_;
    my @found;

    my $searchresult = OddMuse::Searching->new( $searchtext );
    for my $page ( keys %{ $searchresult } ) {
        if ( $searchresult->{ $page }{ 'wikiname' } ) {
            push( @found, $searchresult->{ $page }{ 'wikiname' } );
        }
    }
  return @found;
} ## end sub SearchTitleAndBody

=item * DoBrowseRequest

Replaces the native wiki search mechanism entirely.  This is an ugly, ugly
hack: all it really does is call MetaSearch() instead of DoSearch().

=cut

sub DoBrowseRequest {

    # We can use the error message as the HTTP error code
    ReportError( Ts( 'CGI Internal error: %s', $q->cgi_error ), $q->cgi_error )
      if $q->cgi_error;
    print $q->header( -status => '304 NOT MODIFIED' ) and return
      if PageFresh();    # return value is ignored
    my $id = GetId();
    my $action = lc( GetParam( 'action', '' ) );    # script?action=foo;id=bar
    $action = 'download'
      if GetParam( 'download', '' ) and not $action;    # script/download/id
    my $search = GetParam( 'search', '' );
    if ( $Action{ $action } ) {
        &{ $Action{ $action } }( $id );
    } elsif ( $action and defined &MyActions ) {
        eval { local $SIG{ __DIE__ }; MyActions(); };
    } elsif ( $action ) {
        ReportError( Ts( 'Invalid action parameter %s', $action ),
                     '501 NOT IMPLEMENTED' );
    } elsif ( ( $search ne '' ) || ( GetParam( 'dosearch', '' ) ne '' ) )
    {                                                   # allow search for "0"
        MetaSearch( $search );
    } elsif ( GetParam( 'title', '' ) and not GetParam( 'Cancel', '' ) ) {
        DoPost( GetParam( 'title', '' ) );
    } elsif ( $id ) {
        BrowseResolvedPage( $id );                      # default action!
    } else {
        ReportError( T( 'Invalid URL.' ), '400 BAD REQUEST' );
    }
} ## end sub DoBrowseRequest

=item * PrintJournal

Replaces the journal rendering so that our pages are displayed properly, with
smart titles.  Again, an ugly hack: all it does is call MetaPrintAllPages()
instead of PrintAllPages().
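
The parameters are the same as the native PrintJournal.  A hypothetical call
for the ten most recent date-named pages (the regexp and count shown are the
documented defaults):

    PrintJournal( 10, '^\d\d\d\d-\d\d-\d\d', '', 0, '' );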

=cut

sub PrintJournal {
  return if $CollectingJournal;                         # avoid infinite loops
    local $CollectingJournal = 1;
    my ( $num, $regexp, $mode, $offset, $search ) = @_;
    $regexp = '^\d\d\d\d-\d\d-\d\d' unless $regexp;
    $num    = 10                    unless $num;
    $offset = 0                     unless $offset;
    my @pages = (
                grep( /$regexp/,
                      $search ? SearchTitleAndBody( $search ) : AllPagesList() )
                );
    if ( defined &JournalSort ) {
        @pages = sort JournalSort @pages;
    } else {
        @pages = sort { $b cmp $a } @pages;
    }
    if ( $mode eq 'reverse' ) {
        @pages = reverse @pages;
    }
  return unless $pages[$offset];    # not enough pages
    my $max = ( $#pages < $offset + $num ) ? $#pages : ( $offset + $num - 1 );
    @pages = @pages[ $offset .. $max ];
    if ( @pages ) {

       # Now save information required for saving the cache of the current page.
        local %Page;
        local $OpenPageName = '';
        print $q->start_div( { -class => 'journal' } )
          . $q->comment( "$FullUrl $num $regexp $mode $offset" );
        MetaPrintAllPages( 1, 1, @pages );
        print $q->end_div();
    } ## end if ( @pages )
} ## end sub PrintJournal

=item * MetaPrintAllPages

Prints the pages in a collection with their proper smart titles, if they
exist.  Called by both PrintJournal above and DoCollect in our modified
calendar.pl.
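
The first flag turns page titles into links and the second adds a per-page
comments link, as in the native PrintAllPages.  A hypothetical call with two
date-named pages:

    MetaPrintAllPages( 1, 1, '2006-10-07_Entry', '2006-10-06_Entry' );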

=cut

sub MetaPrintAllPages {
    my ( $links, $comments, @pages ) = @_;
    my $lang = GetParam( 'lang', 0 );
    @pages = @pages[ 0 .. $JournalLimit - 1 ]
      if $#pages >= $JournalLimit and not UserIsAdmin();
    use MLDBM qw( DB_File Storable );    # compile-time, but kept next to the tie for clarity
    tie my %backhash, 'MLDBM', $datafile # $datafile comes from OddMuse::Searching
      or die "Cannot open file $datafile: $!\n";
    for my $id ( @pages ) {
        OpenPage( $id );
        my @languages = split( /,/, $Page{ languages } );
      next if $lang and @languages and not grep( /$lang/, @languages );
        my $title;
        if ( $backhash{ $id }{ 'title' } ) {
            $title = $backhash{ $id }{ 'title' };
        } else {
            $title = NormalToFree( $id );
        }

        print $q->start_div( { -class => 'page' } )
          . $q->hr
          . $q->h1( $links
                    ? GetPageLink( $id, $title )
                    : $q->a( { -name => $id }, $title ) );
        PrintPageHtml();
        if (     $comments
             and UserCanEdit( $CommentsPrefix . $id, 0 )
             and $id !~ /^$CommentsPrefix/ )
        {
            print $q->p( { -class => 'comment' },
                         GetPageLink( $CommentsPrefix . $id,
                                      T( 'Comments on this page' )
                                    )
                       );
        } ## end if ( $comments and UserCanEdit...
        print $q->end_div();
    } ## end for my $id ( @pages )
    untie %backhash;
} ## end sub MetaPrintAllPages

1;

__END__

=back

=head1 BUGS AND LIMITATIONS

No bugs have been reported.

Please report any bugs or feature requests to C<cmauch@gmail.com>.

=head1 AUTHOR

Charles Mauch <cmauch@gmail.com>

=head1 LICENSE

Copyright (c) 2006 Charles Mauch

This program is free software; you can redistribute it and/or modify it under
the terms of the GNU General Public License as published by the Free Software
Foundation; either version 2 of the License, or (at your option) any later
version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE.  See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with
this program; if not, write to the Free Software Foundation, Inc., 51 Franklin
Street, Fifth Floor, Boston, MA  02110-1301, USA.

=head1 SEE ALSO

perl(1).

=cut

# $Id: EvilDoers.pm 85 2006-10-07 07:54:35Z cmauch $
